how to auto-install npm packages with webpack - node.js

I want to automate some tasks that I do all the time, so I have two questions:
1. Can I have all my npm packages point to a cache location (in package.json), so that while I develop I don't have to re-download packages I already have (and save space), but switch back to the normal package names when I deploy?
2. Can I do something with webpack so that typing require in a JS file automatically "installs" the package (using the method from #1)?
Can these be done? I want to automate a lot of what I'm doing and this feels like a good starting point. Thanks in advance.

So, basically:
As Mike 'Pomax' Kamermans and Louy said, you don't need to worry about npm re-downloading packages: it already caches them behind the scenes.
I stumbled upon the auto-install-webpack-plugin today while looking for something similar, and it seems to be what you're describing. I haven't tested it yet, though.
I can edit this answer after testing it, but let me know if you get to it first.
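For reference, wiring such a plugin into a webpack build usually follows the standard plugin pattern sketched below. The constructor name and options here are assumptions, not the plugin's documented API; check its README before relying on this.

// webpack.config.js - illustrative only; assumes the plugin follows
// the usual webpack plugin pattern of exporting a constructor
const AutoInstallPlugin = require('auto-install-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  output: { filename: 'bundle.js' },
  plugins: [
    // the idea: when a require() hits a missing module, the plugin
    // runs npm install for it during the build
    new AutoInstallPlugin()
  ]
};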

Related

create scaffolding generators like create-react-app/open-wc using node

I am trying to create a project structure for my team like the one implemented by open-wc or create-react-app:
you just say npm init @open-wc, it asks a couple of questions, and it creates the folder with the specified configuration.
I didn't find good articles on Google about this, apart from exploring the GitHub projects.
Maintainer of open-wc here :)
So to get an npm init script all you need to do is define a bin in your package.json.
This is what we use for npm init @open-wc:
"name": "@open-wc/create",
"bin": {
  "create-open-wc": "./dist/create.js"
},
Then, for the name, you have 2 choices:
create-foo will be available via npm init foo
@foo/create will be available via npm init @foo
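One detail worth spelling out: the file referenced by bin has to start with a node shebang (and be executable), otherwise npm can't run it when someone calls npm init. A minimal illustrative entry script (the logging is just a placeholder, not the actual open-wc code):

#!/usr/bin/env node
// dist/create.js - the script that the "bin" field points at
console.log('running the generator...');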
The generator itself
That's a rather sad story... we looked around but didn't find anything that really fit our use case. There is http://yeoman.io/, which we used initially, but it's huge, and it meant a bootup time of ~30-40 seconds before the menu even appeared. We felt we needed to do something, so now we roll our own solution.
It covers what we need at a fraction of the size (especially by being very careful with dependencies), which reduced our bootup time to about 5-10 seconds.
We thought about promoting it as a separate standalone project, but truth be told we don't have the manpower for it. It's just 4 files you can find here: https://github.com/open-wc/open-wc/tree/master/packages/create/src - beware, there is no documentation and quite a few rough edges.
Still, if you don't find a better solution, feel free to join us; with some help we could make it a separate product.
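If you do roll your own like they did, the core of such a generator is small: ask a few questions, then write files to disk. A minimal sketch under those assumptions (this is illustrative, not the open-wc code; real generators usually copy whole template trees):

// create.js - ask one question, scaffold one folder (illustrative)
const fs = require('fs');
const path = require('path');
const readline = require('readline');

const rl = readline.createInterface({ input: process.stdin, output: process.stdout });

rl.question('Project name: ', (name) => {
  const dir = path.resolve(process.cwd(), name);
  fs.mkdirSync(dir, { recursive: true });
  // write a starter package.json; a real generator would copy templates
  fs.writeFileSync(
    path.join(dir, 'package.json'),
    JSON.stringify({ name, version: '0.1.0' }, null, 2)
  );
  console.log(`Created ${dir}`);
  rl.close();
});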

electron - incremental updating?

I am using electron-vue & electron-packager.
I am wondering whether I can do something like incremental updating: that is, after running an electron build command, instead of copying the whole electron-linux-x64 folder to my destination machine to bring it up to date, I only copy some files in the folder.
Here is what I have found so far: I edit some code for the renderer process, then let electron-packager build a package for Linux. Not all the generated files change; it seems only the resources/*.asar files do. If I just copy those files to the destination machine, it seems to update fine. But I am not sure whether some hidden files change too.
I would appreciate it if anyone could help me!
Since there are some upvotes on this question, and after three years I have gained more knowledge, let me answer my own question so that whoever reads this post can find a solution :)
Firstly, in 2020 there may already be ready-made solutions. For instance, try this and this.
Secondly, you can use rsync to copy only the changed parts of a folder. Moreover, if a big file (say 10 GB) changes only a little bit in the middle (say 1 MB), it will transfer only that little bit. It's a general tool and can be used everywhere.
Lastly, as a side remark, manually copying your files to the target machine is not a good idea; try to automate the process. The simplest option is a several-line bash script using scp/rsync and the like; at the complex end there are Docker and Kubernetes.
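As a concrete sketch of that automation, here is a small Node script that shells out to rsync to push only what changed. The host, paths, and flags are illustrative assumptions; adjust them to your setup and make sure rsync is installed on both ends.

// deploy.js - push only changed build output via rsync (illustrative)
const { execFileSync } = require('child_process');

const src = 'electron-linux-x64/';            // trailing slash: sync contents
const dest = 'user@example.com:/opt/myapp/';  // hypothetical target

// -a preserves attributes, -z compresses, --delete removes stale files
execFileSync('rsync', ['-az', '--delete', src, dest], { stdio: 'inherit' });
console.log('Deployed changed files only.');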

How can I list all npm packages with above X downloads?

I'm looking for an API that lets you request the names of all npm packages with more than X downloads. A curl command to do this would be great, but if someone knows where I can get the data, I could write my own script for it.
A similar question has been asked before, but nobody seems to have a good answer for it. Listing the most depended-upon packages is easier, if that also fits your need.
You can also use the download-counts API (https://github.com/npm/registry/blob/master/docs/download-counts.md) to query stats for individual packages, though that probably isn't efficient enough on its own.
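For the per-package route, the endpoint documented there has the shape api.npmjs.org/downloads/point/<period>/<package>. Below is a sketch that filters a candidate list by a threshold; note you still have to get the candidate names from somewhere else, and the names here are just examples:

// filter-downloads.js - needs Node 18+ for the global fetch
const THRESHOLD = 1_000_000;
const candidates = ['react', 'left-pad', 'lodash']; // example names only

async function main() {
  for (const name of candidates) {
    const res = await fetch(`https://api.npmjs.org/downloads/point/last-month/${name}`);
    const { downloads = 0 } = await res.json();
    if (downloads > THRESHOLD) console.log(`${name}: ${downloads}`);
  }
}

main().catch(console.error);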

How to tell Node.js to use modules from global by default?

Is there any way to tell Node.js to also look in the global modules folder by default, without changing the sources?
I am trying to avoid my project folders (up to a hundred packages) getting cluttered with thousands of sub-folders (which also slows most IDEs to their knees). I am aware of the npm link trick, but it doesn't work on all platforms or it causes other problems. Also, npm/npm3 is sometimes so slow that I have to wait an entire day before my project is ready to actually work on (and I have a fast computer and broadband).
Known solutions:
Changing the NODE_PATH environment variable is out for other reasons, and shell .rc changes are a little bad too.
Changing core files is easy, but it also requires patches in many other places (when using Node.js as a dependency, for instance).
Patching Node.js's require function, as in other loaders like require-js, which supports require({cache:{}}) or require({config:{}}).
In the end I went with https://github.com/h2non/requireg. It doesn't need any kiddie hacks like npm link or special environment variables, and it works great. It comes with a globalize function which makes subsequent require calls also look in the global folders.
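Based on that description, usage looks roughly like the sketch below; treat the details as assumptions and check the requireg README for the exact API.

// assumes `npm install requireg` plus some globally installed packages
const requireg = require('requireg');

// resolve a module from the global folders explicitly
const lodash = requireg('lodash');

// after globalize(), plain require() also searches the global folders
requireg.globalize();
const express = require('express');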

Is there a generic way to consume my dependency's grunt build process?

Let's say I have a project where I want to use Lo-Dash and jQuery, but I don't need all of the features.
Sure, both of these projects have build tools, so I can compile exactly the versions I need to save valuable bandwidth and parsing time, but I think it's quite uncomfortable and ugly to install both of them locally, generate my versions, and then check them into my repository.
Much rather I'd like to integrate their grunt process into my own and create custom builds on the go, which would be much more maintainable.
The Lo-Dash team offers this functionality with a dedicated CLI and even wraps it in a grunt task. That's very nice indeed, but I want a generic solution to this problem, as it shouldn't be necessary for every package author to replicate this.
I tried to achieve this with some grunt-shell hackery, but as far as I know it's not possible to install devDependencies more than one level deep, which makes it even more ugly to execute the required grunt tasks.
So what's your take on this, or should I just move this over to the 0.5.0 discussion of grunt?
What you ask assumes that the package has:
A dependency on Grunt to build a distribution; most popular libraries have this, but some of the less common ones may still use shell scripts or npm run scripts for general minification/compression.
Some way of generating a custom build in the first place with a dedicated tool, as Modernizr or Lo-Dash have.
You could perhaps substitute number 2 with a generic tool that parses both your source code and the library code and uses code coverage to eliminate unnecessary functions from the library. This is already being developed (see goldmine), though I can't make any claims about how good it is because I haven't used it.
Also, I'm not sure how that would work in an AMD context with a lot of interconnected dependencies; ideally you'd run the r.js optimiser, get an almond build for production, and then filter out unnecessary functions (most likely with Istanbul). You'd then have to make sure the filtered script still passes all your unit/integration tests. Not sure how that would end up looking, but it'd be pretty cool if it could happen. :-)
However, there is a task especially for running Grunt tasks from 'sub-gruntfiles' that you might like to have a look at: grunt-subgrunt.
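For completeness, here is a hedged sketch of what a grunt-subgrunt setup might look like in your own Gruntfile. The projects mapping and the task name are illustrative assumptions; check the plugin's README for the exact option names.

// Gruntfile.js - illustrative grunt-subgrunt wiring
module.exports = function (grunt) {
  grunt.initConfig({
    subgrunt: {
      lodash: {
        projects: {
          // run this dependency's own Gruntfile target from our build
          'node_modules/lodash': 'build' // hypothetical target name
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-subgrunt');
  grunt.registerTask('default', ['subgrunt']);
};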
