As of now I am installing node modules every time for each new Angular project. Is it possible to use one project's node_modules for another project by configuring a file (like changing a path somewhere so it can use those modules)?
From Angular 6 onward, you can generate multiple applications in a single Angular workspace.
https://angular.io/cli/generate#application-command
Applications generated by the Angular CLI stay in the same workspace and share node_modules.
For example, if you want to generate app my-project:
ng generate application my-project
If its dependencies are no different from the previous application's, you can use the --skipInstall=true option with the ng generate command.
And ng serve it with --project option:
ng serve --project=my-project
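For reference, here is a minimal sketch of how the projects section of the workspace's angular.json might end up looking after generating the second application (project names and paths here are illustrative; the real file also contains build and serve targets):
{
  "version": 1,
  "projects": {
    "first-app": {
      "projectType": "application",
      "root": "projects/first-app",
      "sourceRoot": "projects/first-app/src"
    },
    "my-project": {
      "projectType": "application",
      "root": "projects/my-project",
      "sourceRoot": "projects/my-project/src"
    }
  }
}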
First off, I was shocked to see that this is actually a thing some people are doing; see https://github.com/nodejs/help/issues/681. However, I would advise against it.
The idea behind each project having its own node_modules folder (and package.json) is that each of your projects should specify its own dependencies (including specific versions), which is a good thing for the stability, predictability, and reproducibility of your various projects. Here's a pretty good write-up on the Node dependency model: https://lexi-lambda.github.io/blog/2016/08/24/understanding-the-npm-dependency-model/
Now, if you're talking about a local module (one that you created yourself), have a look at https://docs.npmjs.com/cli/link.html
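As a rough sketch of how npm link works (the folder and package names here are made up for illustration):
$ cd ~/projects/my-local-lib     # the module you created yourself
$ npm link                       # registers a global symlink for it
$ cd ~/projects/consumer-app     # a project that wants to use it
$ npm link my-local-lib          # symlinks it into this project's node_modules
This way the consuming project resolves the module normally, without copying it or sharing a node_modules folder.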
I have a React Native app built using TypeScript, and I would like to also develop a number of CLI tools to help developers and 'back office' folks, also using TypeScript. I would like these to live in the same monorepo.
Based on advice from colleagues and my own research, I have tried doing this by creating a subfolder in the repo, creating a second package.json (and all the other config files), and running npm install as if it were a completely separate project. It didn't take long for this to become a total mess: primarily duplicate imports, where one thing mysteriously imports modules from another target's node_modules, but also just remembering to re-run npm install in all the different subprojects before each commit, etc. It gets even more confusing with the TS build folders lying around; they're another place for people to import the wrong thing from. The confusion caused by this approach has been a significant drain on productivity, and it feels like there has to be a better way. So that's the question:
What is the best practice for building multiple TS/Node targets that all share code with one another, from a single monorepo? (By "targets" I don't mean ES6 vs. ESNext; I mean it in the C/C++ sense of multiple output "programs": in this case I want to both create the bundle necessary for my RN app and generate a CLI executable.)
If it matters, I am also using Expo.
You're essentially describing a monorepo. pnpm has fantastic tooling out of the box for this.
Download the pnpm CLI and install it:
$ npm i -g pnpm # Download pnpm
$ mkdir monorepo # Create a new monorepo folder
$ cd monorepo
$ mkdir packages # This will be the folder where you add your
# apps, libraries, etc.
$ touch package.json # Create a package.json
$ echo "{}" > package.json
Create a pnpm-workspace.yaml file and add the following:
packages:
- 'packages/**'
Congratulations. You now have a monorepo where you can add multiple apps. Here's how it works:
Every folder under packages that has a package.json file is now a Node app that you can control using pnpm from the root of your workspace.
When you run pnpm i from the root, it will install all of the dependencies for all of your apps and libraries.
You can even install libraries that you create locally without needing to run npm link, or deal with adding file:../ to your package.json or any of that mess.
pnpm also supports running scripts across all of your repos at the same time.
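For example, assuming your packages define the usual scripts (the script names here are illustrative), you can run them everywhere with the --recursive (-r) flag:
$ pnpm -r run build    # runs the "build" script in every package that defines one
$ pnpm -r test         # same idea for tests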
I made an example project for you to look at. In this example, you'll notice I created a library named my-library and a dependent app named awesome-app. When I ran
$ pwd
~/stackoverflow/packages/awesome-app
$ pnpm i my-library
pnpm knew to pick up my workspace library automatically:
{
"name": "awesome-app",
"dependencies": {
"my-library": "workspace:0.1.0"
}
}
Note the workspace:0.1.0 that matches the version of the package I have in my repo.
I love pnpm. It isn't without its faults, but I've been very productive with it. There are also alternatives, such as Lerna, and npm and yarn support their own workspace protocols. I just find pnpm to work the best and to have the least ceremony.
Enjoy.
I am trying to figure out an effective way to bundle and distribute various dependencies (node modules and/or "client-side" scripts and frameworks like Angular) with my Electron app.
Although the basic approach of npm install module-name --save works well for development, it is not so good in the end when it comes to minimizing the size of your app and using minified resources at runtime. For instance, virtually all npm packages (including node modules) come with a lot of extra baggage, like readmes and various versions of their components (minified, not minified, ES2015, non-ES2015, etc.). While these are great for development, none of these files needs to be included in the version you will be distributing.
Currently there seem to be 2 ways to sort of address the problem:
Electron Builder recommends using a two-file package.json system.
Any dependency that is used during development only should be npm-installed using --save-dev, and then pruning should be used when building the app for distribution.
In that regard I have several questions:
I am not quite sure why there is a need for a two-file package.json system if one can install dev-only modules/dependencies with --save-dev and then prune during the actual app build/compilation?
Regardless of which method above is used, you still end up with full npm packages in your app, including all the miscellaneous/duplicated files that are not used by your app. So how does one "prune", so to speak, the npm packages themselves, so that only the files actually used at run-time (like minified scripts) get included?
Would using Bower for "client-side" packages (like Angular 2, Bootstrap, jQuery, etc.) and npm for node modules (like fs-extra) be a better option as far as separation of concerns and ease of bundling later?
Could webpack be used to produce only the needed files, at least for the "client side", so that only real node modules are included with the app, while the rest of it ships as a webpack-compiled set of files?
Any practical tips on how this bundling of dependencies and distribution should be carried out in practice? Gulp scripts? Webpack scripts? Project structure?
Thank you.
I am still on the learning curve of adopting best practices in code deployment, but here is my starting list of what is recommended.
Yes, npm install --save-dev is the first, easiest thing to do to isolate dev- and build-specific packages. This includes gulp/grunt/webpack and their loaders or additional packages. These are used only for building and never in the code that actually runs. All packages used by the app should be installed with npm install --save so that they are available at the project level. Then, in production, you would run npm install --production on machines that should not install dev packages at all. See What's the difference between dependencies, devDependencies and peerDependencies in npm package.json file? for more info.
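As a minimal sketch of the resulting package.json (the package names and versions are just examples), runtime packages land in dependencies and build-only packages in devDependencies, and npm install --production installs only the former:
{
  "dependencies": {
    "fs-extra": "^0.30.0"
  },
  "devDependencies": {
    "webpack": "^1.13.0"
  }
}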
While the original recommendation was to use Bower for the client side and npm for the server side, both can be installed using npm. After all, both do the same job of managing packages and dependencies. However, if webpack is used, the recommendation is to use npm for client-side dependencies as well.
package.json should be thought of as managing the dependent packages only, not as a build tool. For building and picking only the required files, you need task runners like gulp/grunt or bundlers like webpack.
While gulp/grunt is very popular for build automation, which includes bundling all dependent javascript and minifying it into one file, webpack/browserify is a better option, as it supports module imports. A module import is the intuitive Node.js way of requiring one module in another:
var util = require('./myapp/lib/utils.js');
This is a powerful way of stating the required dependencies in the code. The webpack builder runs as a build process, like gulp, but instead of looking through an html file for all the js files, it looks at a starting js file and determines all the dependent code mentioned by the require statements, recursively, and packages it accordingly. It also minifies the code, and it can load css and image files into one bundle to reduce server trips. If needed, some modules can be configured to load dynamically at runtime, further reducing page load. NPM vs. Bower vs. Browserify vs. Gulp vs. Grunt vs. Webpack discusses this at length.
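As an illustration, here is a minimal webpack configuration in that spirit (the file and folder names are just examples, and minification would need the appropriate plugin or production setting on top of this):
// webpack.config.js - a minimal sketch
module.exports = {
  entry: './src/main.js',      // the starting js file webpack walks from
  output: {
    path: __dirname + '/dist',
    filename: 'bundle.js'      // everything reachable via require() ends up here
  }
};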
Webpack can be used to bundle the client-side app optimally, while the server side need not be bundled or minified, as there is no download involved.
In webpack, though you can reference dependent modules by lib file path, the recommendation is to npm install all dependencies and reference the module name. For example, if you have installed jquery, instead of giving a path like /libs/jquery.min.js, you can just write 'jquery'. Webpack will automatically pull in the jquery lib and its dependencies and minify them. If there are common modules, they will be chunked too. So it is better to npm install dependent packages instead of bower install.
ES2015 provides a lot of benefits at coding time, including type checking and modules. However, not all browsers support the spec natively yet, so you need to transpile the code to an older version that browsers understand. This is done by transpilers like Babel, which can be run with gulp. Webpack has a built-in Babel loader, so webpack understands ES2015. It is recommended to use the ES2015 module system, as it will soon become the de facto way of coding, and since there is a transpiler, there is no worry about it not being supported in IE8/9.
For the project structure you could have:
server
client
  src, containing the js files
  dist, containing the html and generated build files
webpack.dev.config.js and webpack.prod.config.js can be at the root level.
I have found that this area is an ocean, with different schools of best practices. This is probably one set of them. Feel free to choose the set that works for your scenario. I look forward to more comments to add to this set.
I am a beginner to MEAN.
I have followed the following steps to create meanjs app :
installed node v0.12.7
npm install -g bower
npm install -g grunt-cli
npm install -g yo
npm install -g generator-meanjs
cd C:\Users\SHIVAM\Desktop\MyApp
yo meanjs
A MyApp folder containing a mean directory was created.
An error occurred on yo meanjs.
Please provide a solution. I am stuck at the first phase and need to get started as soon as possible.
Version 0.4.* has a different folder structure; the tutorial you are following is probably using an earlier version of MEAN.js. You can find the new structure here: http://www.bossable.com/954/version-0-4-0/ (she has compared the structure with the older one). I was going through the same problem. Follow these tutorials, they will be of great help.
The following directories have been changed in the 0.4.* versions of MEAN.js; 0.4.2 is still unstable, and 0.4.0 and 0.4.1 are recommended for beginners getting started with MEAN.
New files have been added:
1) gulpfile.js: Gulp is relatively faster than Grunt because gulp focuses on code rather than configuration. It uses Node.js streams and executes faster, since it does not open/close files or create intermediary copies all the time. The lack of any up-front configuration, especially specifying a source and destination, is noticed immediately in this file (see the sketch after this list).
2) protractor.conf.js: a support file for end-to-end testing in AngularJS applications. (Pretty neat.)
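For illustration, a tiny gulpfile in that code-over-configuration style might look like this (the task name, globs, and the gulp-uglify plugin are made-up examples):
// gulpfile.js - a minimal sketch
var gulp = require('gulp');
var uglify = require('gulp-uglify');   // one example minification plugin

gulp.task('scripts', function () {
  return gulp.src('modules/*/client/**/*.js')   // source stated right in the code
    .pipe(uglify())                             // transform step
    .pipe(gulp.dest('dist'));                   // destination stated right in the code
});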
The app folder is removed, which is the major concern for noobs trying their hands at MEAN.js while following the older tutorials. Controllers, views, and models are now moved into a separate folder called modules; previously they were in the public folder.
You can explore modules/core/server and modules/user/server in your project.
Client, server, and test folders are added under modules/core and modules/user. All the client-side angular code will now be in modules/core/client, making it easier to access all the services and directives of the application's different modules in one place. And all the backend or server-side files go into modules/core/server.
public/application.js and public/config.js are now config.js and init.js in the modules/core/client/app folder.
The config folder is split, and new folders assets and lib are added.
The previous env folder has been split into env and assets. All the references, the mongodb port, and the session keys go into the environment (env) folder, whereas new libraries we define for our project go into assets. For instance, a new css, sass, or less reference will be defined in assets/default.js. config.js and init.js are merged into one file, config/config.js.
lib contains express.js, mongoose.js and socket.io.js.
node_modules is pretty much exactly the same as in the previous versions of MEAN.js.
These are the major changes in the folder structure. New models and controllers for your application are added under modules/users/server, just so you don't get lost while developing your project.
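Putting the changes above together, the 0.4.* layout looks roughly like this (only the folders discussed here are shown):
config/
  env/                references, mongodb port, session keys
  assets/             default.js for new css/sass/less references
  lib/                express.js, mongoose.js, socket.io.js
  config.js
modules/
  core/
    client/           client-side angular, including app/config.js and app/init.js
    server/
  users/
    server/           your new models and controllers go here
gulpfile.js
protractor.conf.js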
Having gone through and used Demeteorizer, I wonder: what are the main differences between setting up Meteor vs. Demeteorizer and running it via Node on your own server?
Meteor only:
- hot-swappable code?
- problems keeping packages the same between production and dev
- the same Meteor version running on prod and dev
- hardcoded environment settings (i.e. mongo)
Demeteorizer:
- platform-independent, as it auto-bundles dependencies and uses pure Node.js
- organise and maintain mongodb how you like (backup scripts, etc.)
I have been using Demeteorizer (packaging -> upload -> running with forever), but I wonder if there are any performance or other issues in the long run.
I have seen packages such as "authentication" run okay locally but very slowly on the test server (it hangs on submit, indicating sync problems?).
Thanks in advance.
ref: https://twitter.com/SachaGreif/status/424908644590030848
Demeteorizer builds on top of meteor bundle with one small difference: Demeteorizer builds a package.json for you and deletes the node_modules directories.
Without Demeteorizer you would have a bit of trouble deploying your app, particularly if it was on a different platform from the one you built your app on.
If you look at Meteor's own docs, you have to remove fibers and manage your npm modules yourself, manually. With a package.json you can run npm install and have them all installed for you, including the ones that come from packages.
Why is this useful? For services like Modulus it means you can upload an app and have it install all your dependencies for you without you having to think about it, as if it were an ordinary Node.js app.
Everything that applies to meteor bundle will also apply to Demeteorizer, as it is still the same Meteor-bundled app, just with the package.json. So you can use forever, hard-coded/environment-based settings, etc. the same way.
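For context, the typical Demeteorizer flow looks roughly like this (the output folder and entry file may differ between versions, and the environment values are placeholders):
$ npm install -g demeteorizer
$ cd my-meteor-app
$ demeteorizer                   # bundles the app and writes a package.json
$ cd .demeteorized               # default output folder
$ npm install                    # restores node_modules from the generated package.json
$ MONGO_URL=mongodb://localhost:27017/myapp ROOT_URL=http://localhost PORT=3000 node main.js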
I am new to Node.js programming and I have recently created a sample working web application using express, backbone, and other complementary view technologies, with MongoDB. Now I am at a point where I want to deploy it to a staging environment, and I am not sure how to package this application and distribute it. [I can take care of MongoDB and set it up separately.]
I am from the Java world, where we create jars for reusable libs and war/ear packages for web applications, which are deployed in a servlet container. Now, since Node.js itself acts as the web container as well, how do I package my webapp?
Are there any standard formats/guidelines for packaging node webapps built using express? (Is there a similar jar/war packaging system for node apps?)
How do I deploy it once packaged? Would it become an exe, since it is also its own container?
PS: As of now I am thinking of just manually copying all the required source files to the staging environment, running npm commands to download all the dependencies on that machine, and then using forever or some other mechanism to run my server.js (plus some sort of monitoring, in case the app crashes and forever fails). I am not sure that is the right way; I am sure there must be some standardized way of addressing this problem.
Deploying Node.js applications is very easy. In Maven, there is pom.xml; the related concept in Node.js is package.json. You can state your dependencies in package.json. You can also do environment-specific setup in package.json. For example, in a dev environment you can say
I want to run unit tests.
but in production:
I want to skip unit tests.
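For example, here is a sketch of the scripts section of package.json (the commands themselves are illustrative and depend on your project):
{
  "scripts": {
    "start": "node app.js",
    "test": "mocha test/"
  }
}
In development you run npm test before starting the app, while in production you just run npm start and skip the test step.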
You have local repositories for Maven under the .m2 folder. In Node.js, there is the node_modules folder under your Node.js project, where you can see the module folders by name.
Let's come to the Grunt part of this answer. Grunt is a task manager for your frontend assets: html, javascript, css. For example, before deployment you can minify html, css, and javascript, even images. You can also put grunt task invocations in package.json.
If you want to look at a sample application, you can find an example blog application here. Check folder structure and package.json for reference.
For deployment, I suggest Heroku for startup applications. You can find a how-to here. This is simple git-based deployment.
To run the project, simply set your environment with NODE_ENV=development and run node app.js (app.js being the entry file in your project).
Here are the related concepts for Java and Node.js:
maven clean install => npm install
.m2 folder => node_modules(Under project folder)
mvn test => npm test (test section in package.json)
junit, powermock, ... => mocha, node-unit, ...
Spring MVC => Express.JS
pom.xml => package.json
import package => require('module_name')
There is no standardized way, but you're on the right track. If your package.json is up to date and well kept, you can just copy/zip/clone your app directory to the production system, excluding the node_modules.
On your production system, run npm install to install your dependencies, npm test if you have tests, and finally NODE_ENV=production node server.js.
Some recent slides that I found quite helpful, and that also cover wrappers like forever, can be found here.
Hope this might be helpful for somebody looking for a solution: packaging of Node.js apps can be done using the npm pack command. It creates a tarball of your application which can be installed and run in a production/staging environment.
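A quick sketch of that flow (the generated file name follows the name and version fields of your package.json, and the host name here is made up):
$ cd my-app
$ npm pack                            # produces my-app-1.0.0.tgz
$ scp my-app-1.0.0.tgz staging:       # copy the tarball to the target machine
$ ssh staging "npm install ./my-app-1.0.0.tgz"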
Are there any standard formats/guidelines for packaging node webapps built using express? (Is there a similar jar/war packaging system for node apps?)
Yes, the CommonJS Packages specification:
This specification describes the CommonJS package format for distributing CommonJS programs and libraries. A CommonJS package is a cohesive wrapping of a collection of modules, code and other assets into a single form. It provides the basis for convenient delivery, installation and management of CommonJS components.
For your next question:
2. How do I deploy it once packaged? Would it become an exe, since it is also its own container?
I second Hüseyin's suggestion to deploy on Heroku for production. For development and staging I use Node-Appliance with VirtualBox and Amazon EC2, respectively:
This program takes a Debian machine built by build-debian-cloud or Debian-VirtualBox-Appliance and turns it into a Node.js "appliance", capable of running a Node application deployed via git.
Your webapp will not become an exe.
A few ways to approach this:
Push your code into a Git repository, excluding everything that isn't your code (node_modules/**), then pull it in your staging environment and run npm install to restore all the dependencies.
Create an npm package out of it and install it via npm in your staging environment (this should also take care of all the dependencies).
Manually copy/ssh the files to your staging environment (this can be automated with Grunt), then restore your dependencies via npm.
I used Zeit's pkg module. It can create cross-platform deliverables for Linux/Windows/macOS. I have actually used it in production and it works fine without any issues.
It takes in all the js scripts and packages them into a single file.
The reason I used it is that it helps in securing your source code. That way, in production at a customer's environment, they will have access to the application but not the source code.
Also, one of the advantages is that in the production environment you do not actually need to have the customer install Node.js, as the Node binaries also get packaged inside the build.
https://www.npmjs.com/package/pkg
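For reference, a minimal sketch of how pkg is typically invoked (the entry file and target list are illustrative):
$ npm install -g pkg
$ pkg server.js --targets node14-linux-x64,node14-win-x64,node14-macos-x64
# emits one self-contained executable per target, with the Node runtime baked in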