I always feel stupid asking here because people are often confused by my questions, or I have a dumb problem, but I'm working on a program in Node.js and the text editor I'm using (Notepad++) doesn't seem to want to save files in the System32 directory (the directory where my modules are), and that is where my script is as well. (So I have .../.../node_modules/(modules) and .../.../node_modules/script.js.) This becomes a pain when I want to edit the script: I have to copy it to my desktop, edit it there, then overwrite the one in the node_modules directory. I tried saving the script to my desktop and running it from there, but it just gives me a "module not found" error. (In my script I load the modules with var example = require('example.js').) Is there any way I can get it to load the modules from the node_modules directory while keeping the script file somewhere easily accessible and editable (i.e. the desktop)? (Sorry if this is confusing; I'm not the best at these kinds of things.)
I'm not 100% sure that this is what's happening because I haven't used npm on Windows, but it sounds to me like you're installing your dependencies globally using npm -g. The more proper way to use Node is to install your dependencies locally, using npm without the -g flag. That way your dependencies get installed in your current working directory.
For example, let's say you've saved your project in a directory on your Desktop, and your script uses require("lodash"). If you cd to your directory and run npm install lodash, then the lodash module will be available to your script.
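For illustration, a hypothetical layout (the project folder name is made up, and lodash is just the example package from above):

my-project/
    node_modules/
        lodash/
    script.js

and inside script.js:

// lodash is resolved from the node_modules folder sitting next to script.js
var _ = require('lodash');
console.log(_.capitalize('hello'));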
I am quite new to programming and today decided to attempt to create a Node.js and puppeteer project with the purpose of scraping a website into a .txt file. I ran into issues straight away since, for the most part, I have no idea what I'm doing. After installing Node.js and puppeteer, I was guided by some videos and articles I found to create my first project. In the command prompt, using mkdir and later cd, I was able to create and access the new directory, but I started running into problems with npm init. It only places the package.json file in the directory, but there isn't a package-lock.json file or node_modules folder anywhere. I have no idea what they do, but I thought this was a problem. When I open cmd and try to run the app by typing node app.js, it returns Error: Cannot find module 'C:\Users\emili\app.js' along with some other gobbledygook. What should I do to be able to run the simple application I wrote?
It seems that you are missing some key knowledge of how Node.js works, but to fix your issue (for now), you will need to take a few steps.
First, in your working directory (where the package.json is), you'll need to install your modules.
Run npm install puppeteer. This will do two things: create the node_modules folder and create the package-lock.json file.
Create a file named app.js (either manually or by running the command touch app.js) in your working directory, and put the following content inside of it:
console.log('Hello, World!');
Save the changes to app.js and then run node app.js in your terminal. You should see Hello, World! output to the terminal.
The reason npm install puppeteer created the node_modules folder and the package-lock.json file is that they weren't needed before then; npm only creates them when you first install a package.
When you run npm install PACKAGE_NAME, you're installing a module (otherwise known as a package), thus it creates the node_modules folder so that it will have a place to put the module so that your code can access it. It also creates the package-lock.json file, which is used to track the module versions inside of your project.
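For instance, once puppeteer is installed locally like this, a script in the same directory can require it straight away. A rough sketch of what the scraping part might eventually look like (the URL and output file name are placeholders, not something from your question):

// app.js - minimal sketch: load a page and save its visible text to a .txt file
const fs = require('fs');
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com'); // placeholder URL
  const text = await page.evaluate(() => document.body.innerText);
  fs.writeFileSync('output.txt', text); // written to the current working directory
  await browser.close();
})();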
With this information, I'd suggest going back to the tutorial you were originally following, working through it again, and trying to understand each of the core concepts before writing any real code.
I'm working on a Javascript project, and as it so happens one of my dependencies pulls in puppeteer, which in turn downloads a whole copy of Chromium into my node_modules. My larger project is split into multiple Javascript packages, so I end up with multiple identical copies of Chromium among other stuff.
Is there a way to deduplicate these packages system wide? Note, npm dedupe seems to do something completely different to what I want.
I imagine there would be a module repository in my home directory which contains every package I need (in every version needed), and the local node_modules directories would then contain only symlinks into that repository. This seems like an incredibly obvious optimisation, but I can't find any way to do it in npm. If not in npm, is it maybe possible in yarn?
As an added complication, this should also work on Windows (where symbolic link support has historically been not so good).
It seems the following command does what I want:
npm config set link -g
Then delete node_modules, and do npm install again. It should be much smaller now.
The documentation says:
If true, then local installs will link if there is a suitable globally installed package.
Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions is met:
The package is not already installed globally, or
the globally installed version is identical to the version that is being installed locally.
I am not sure if this has any negative side-effects - for example clobbering the global namespace with commands I don't want. For now, it seems to work fine.
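If it does turn out to cause problems, the option is easy to inspect and revert with the standard npm config commands (then delete node_modules and run npm install again to go back to plain local installs):

npm config get link
npm config delete link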
My problem is a little complicated, but I'll try to explain:
I have a node.js application that needs to be completely prebuilt and bundled alongside standalone node.js (specifically 4.4.5 LTS), zipped and deployed to offline CentOS 6/7 machines, meaning I cannot run npm install, and there is no gcc/g++/python, so I cannot do things like node-gyp rebuild.
Everything is working correctly except this module: ibm_db.
It's compiled with node-gyp after downloading the db2 cli drivers, but basically it's supposed to work like the regular DB2 client except all its dynamic libraries, binaries etc. are inside the module path itself (node_modules/ibm_db/installer/clidriver).
If I deploy the bundle (which includes all node modules in the tarball, including ibm_db) to another machine, it will probably sit on a different path from the machine on which I built the bundle. When I try to run the app like this: ./node app.js (here node is a symlink to the standalone node binary inside the unpacked bundle), I get this error:
Error: libdb2.so.1: cannot open shared object file: No such file or directory
Now, I can clearly see that libdb2.so.1 is inside node_modules/ibm_db/installer/clidriver/lib, but the paths in bindings.gyp all use the original paths on the build machine, which don't match, so I assume this is where the problem lies.
I can easily just add that path with ldconfig and it would work, however the user profile installing the app will not have superuser access so it's not a real option.
I tried setting the environment variable LD_LIBRARY_PATH, but node.js deletes this entry from process.env on startup, and even if I set it programmatically, like process.env.LD_LIBRARY_PATH='...';, it doesn't seem to do anything.
Is there any way to modify the library path for a compiled module without recompiling/rebuilding it? If it's possible I would assume that would be the easiest solution, but I couldn't find a way to do it.
In a Node package.json file, you can map multiple executables to the PATH environment variable on a global npm install (npm install -g):
"bin": {
"foo": "./bin/foo.js",
"bar": "./bin/bar.js"
},
I have a unique project that requires mapping commands to the PATH on operating systems that do not have them. For example, I want to add a command named grep to the PATH, if and only if it is being installed on a Windows computer. If the computer is running any other OS, the npm installation will obviously fail.
Is there any way to run logic that pre-determines what bin options are available in the installation?
Oh snap - I just had an idea!
Would this work:
Parent module has npm (programmatic version) as a dependency.
On global installation, run a post-install script as declared in the package.json of the parent module.
Post-install script does a check on the system to see which commands exist. This would be more mature than "Windows or not Windows" - it would try to exec a list of commands and see which ones fail.
For every command that doesn't exist, the post-install script programmatically runs npm install -g on the corresponding sub-module (one sub-module per command, such as grep); see the sketch below.
This would take a while and the npm module is huge, but it seems like it would work. No?
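Something like this rough sketch, maybe (the command list and shim-module names are made up, and it shells out to npm for brevity instead of using the programmatic API):

// postinstall.js - check which commands exist, install a shim package for each missing one
const { exec } = require('child_process');

const commands = ['grep', 'sed', 'awk']; // hypothetical list of commands to provide

commands.forEach((cmd) => {
  exec(`${cmd} --version`, (err) => {
    if (err) {
      // command not found (or errored), so install the matching hypothetical shim package
      exec(`npm install -g my-win-${cmd}`, (installErr) => {
        if (installErr) {
          console.error(`failed to install my-win-${cmd}: ${installErr.message}`);
        }
      });
    }
  });
});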
There doesn't seem to be a way to do this directly through package.json, but it might be possible (and desirable) to do something like:
Make a separate npm module for each executable you want to register (e.g. my-win-grep).
In the my-win-grep module, implement the executable code you want to run, and register the PATH/executable value in this module.
In the package.json for my-win-grep, include an os field that limits it to installing on Windows (see the sketch below).
In your original module, list my-win-grep as an optionalDependency.
In this way, if someone installs your original module on Windows, it would install my-win-grep, which would register an executable to run under the grep command.
For users on other systems, the my-win-grep module would not install because of the os requirement, but the main module would continue to install because it ignores failures under optionalDependencies. So the original grep executable would remain untouched.
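As a rough sketch (the names, paths, and version ranges here are purely illustrative), the package.json of my-win-grep might look something like:

{
  "name": "my-win-grep",
  "version": "1.0.0",
  "os": ["win32"],
  "bin": {
    "grep": "./bin/grep.js"
  }
}

and the original module would then pull it in with:

"optionalDependencies": {
  "my-win-grep": "^1.0.0"
}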
Updated question
This approach does sound like it should work: as you say, the npm dependency is pretty large, but it does avoid having to perform additional symlinking, and it still has the benefit outlined above of having each piece of OS-specific functionality in a separate module.
The only thing to watch for, possibly, in the programmatic npm docs:
You cannot set configs individually for any single npm function at this time. Since npm is a singleton, any call to npm.config.set will change the value for all npm commands in that process.
So this might just mean that you can't specify -g on your installs, but instead would have to set it globally before the first install. This shouldn't be a problem, but you'll probably need to test it out to find out exactly.
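For what it's worth, a sketch of what that might look like with the programmatic npm API the quoted docs refer to (untested, and the package name is a placeholder):

// inside the post-install script, after working out which shim packages are missing
const npm = require('npm');

npm.load({}, (loadErr) => {
  if (loadErr) return console.error(loadErr);
  npm.config.set('global', true); // applies to every npm call in this process, as the docs warn
  npm.commands.install(['my-win-grep'], (installErr) => {
    if (installErr) console.error(installErr);
  });
});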
Lastly...
You might also want to have a look at https://github.com/lastboy/package-script - even if you don't use it, it might give you some inspiration for your implementation.
I've been using Node.js and npm for a few weeks with great success and have started to question the best practice for installing local modules. I understand the global vs. local argument; however, my question has more to do with where to place a local install. Let's say that I have a project located at ~/ProjectA/ which is version controlled and worked on by multiple developers. When initially playing with Node.js and npm, I wasn't aware of the default local installation paths and simply installed the necessary modules from a default terminal, which resulted in an installation path of ~/node_modules. What this ended up doing is requiring all the other developers working on the project to install the modules on their own machines in order to run the application. Having seen where some of the developers ran npm install, I'm still really surprised that it worked on their machines at all (I guess it relates to how Node.js and require() look for modules), but needless to say, it worked.
Now that the project is getting past the "toying around" stage, I would like to setup the project folder correctly. So, my question is, should the modules be installed at ~/ProjectA/node_modules and therefore be part of the version controlled project files, or should it continue to be located at a developer-machine specific location...or does it not really matter at all?
I'm just looking for a little "best-practice" guidance on this one and what others do when setting up your projects.
I think that the "best practice" here is to keep the dependencies within the project folder.
Almost all Node projects I've seen so far (I've been a Node developer for about 8 months now) do that.
You don't need to version control the dependencies. This is how I manage my Node projects:
Keep the versions locked in the package.json file, so everyone gets the same working version, or use the npm shrinkwrap command in your project root (see the sketch below).
Add the node_modules folder to your VCS ignore file (I use git, so mine is .gitignore)
Be happy, you're done!
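As a concrete (illustrative) example of the first point, pinning an exact version in package.json looks like this (the package name and version are just placeholders):

"dependencies": {
  "lodash": "4.17.21"
}

and the ignore entry is a single line in your .gitignore:

node_modules/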