I'm new to Node, coming from a Java background. These days I'm experimenting with each part of a full application: database, REST API, UI.
So far I've written the database-backed logic, which runs on its own, processes text files, stores data about them in the database, and exposes a REST API to query that data. I'm now going to build the UI to navigate that data.
Would it be reasonable to have a structure like this:
- (a) main project folder
- (b) backend application (a Restify server that responds to REST calls by querying the database)
- (c) UI application (an HTTP server that serves the React static files)
If that makes sense, I would guess that:
(b) has a package.json with server- and REST-related dependencies (e.g. Restify, MongoDB, ...)
(c) has another package.json with UI dependencies (e.g. React, Webpack, etc., but not Restify or MongoDB)
(a) has a third package.json that takes care of installing each sub-project (I'd say by running npm install through hand-written npm scripts).
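Something like this is what I have in mind for (a), just to illustrate the idea (the folder and script names are placeholders, not an actual setup):

    {
      "name": "main-project",
      "private": true,
      "scripts": {
        "install:backend": "cd backend && npm install",
        "install:ui": "cd ui && npm install",
        "install:all": "npm run install:backend && npm run install:ui"
      }
    }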
Otherwise, how do you usually handle such Node projects? Do you keep each application completely separate from the rest?
For those who know that tool, this mimics a Maven multi-module project; though that level of automation is not needed, I'd just like to come up with a self-contained package.
This kind of project structure is called a monorepo: a single Node repository that contains multiple packages. There are tools like Lerna to manage them. If you are using yarn as your package manager, it comes with an experimental workspaces feature.
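For example, with yarn workspaces the root package.json can be as small as this (the folder names simply mirror the (b)/(c) layout from the question; depending on your yarn version you may first need to enable the experimental workspaces setting):

    {
      "private": true,
      "workspaces": [
        "backend",
        "ui"
      ]
    }

Running yarn install once at the root then installs the dependencies of both sub-projects and hoists the shared ones.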
Related
I am creating my first Node/Express app, with Angular 7 on the frontend, to be deployed in production. I have the questions below:
What folder structure is preferred? Should I create separate projects for Node and Angular, or a single project (server.js in the root of the Angular project and a server folder for the Express server files)? Which is preferred, given that I have to check the project into a single SVN folder?
Should I use Babel and write the Node server code in ES2015, or continue with the old approach?
It's all up to you. What I am doing is keeping separate directories for the Angular and Node projects:
project
|
client - Your Angular project
server - Your APIs and server-side code (only this folder is required at production level)
Then we can create a gulp file with a build task that builds the client project and puts the resulting build folder inside server -> public.
Now only the server folder is needed in production, where the build is served as static files.
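A minimal sketch of the server side of that setup, assuming an Express server and the client build copied into server/public (the port and paths are assumptions):

    const express = require('express');
    const path = require('path');

    const app = express();

    // Serve the client build that the gulp task copied into server/public
    app.use(express.static(path.join(__dirname, 'public')));

    // API routes would be registered here, e.g. app.use('/api', apiRouter);

    app.listen(3000, () => console.log('Server listening on port 3000'));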
And for the authentication and authorization process you can follow JWT-based permissions.
Generally I would say that separating your client and server code into separate projects is preferred so that you do not have to release both your client and server at the same time when you make a change to one or the other. The rest of my answer is based on the assumption that you would separate the two sides into different projects.
As far as structuring your server-side Express-based application, check out this link for some guidance on how to handle your situation. See the answer to the first question about different approaches to structuring your Express application for different deployment scenarios. Also, if you use the latest LTS version of Node, you will not need a transpiler, because recent Node versions support modern JavaScript features natively.
As far as structuring your client side Angular-based application, check out this link for a very detailed discussion about best practices for structuring your Angular application.
I would prefer the following; in case you need to separate the API layer from the client in the future, you can do it with ease:
project
|----client
| ---client-template   // All UI code like .css/.html files; the Node process initiates from here
| ---client-angular    // All the directives and controllers go here
| ---client-service    // Service layer; all the API calls to the server go here
|----server
| ---server APIs       // separated into their own modules, if any
|--- your API modules and so on...
This gives you flexibility over client and server integration without tight coupling. It is also easy to maintain and debug.
Answer 1: you should make two separate folder/repository structures for the frontend and the backend.
Let's suppose your application grows fast and at some point you want to scale your backend and host your Angular app as a static web app on Amazon S3; with separate folders that will be very easy to manage.
You may also want to use CI/CD; in that case it is also good to have separate folders so you can create separate CI/CD jobs for the backend and the frontend.
Maybe your company hires a developer who is an expert only in the frontend or only in the backend; in that case the company may not want to give them unnecessary code access, so separate repos are an easy option here. (This depends on your team's and company's approach to development.)
Answer 2: I recommend going for ES6 or ES6+ features.
Recent Node.js versions support many ES6+ features out of the box, for example:
- spread operator
- destructuring
- classes (so you can use OOP)
- arrow functions
- let, const
- async/await, etc.
You can use Babel if you need some other feature that is not supported by Node.js. There can be many reasons for using Babel, but I want to know which specific feature you want to use it for, so I can explain accordingly.
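For example, all of the following already runs on a recent Node version without Babel (just a throwaway sketch):

    // No transpilation needed on recent Node versions
    const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms)); // arrow functions

    const config = { host: 'localhost', port: 3000 };
    const { host, port } = config;              // destructuring
    const merged = { ...config, secure: true }; // spread operator

    class Greeter {                             // classes
      constructor(name) {
        this.name = name;
      }
      greet() {
        return `Hello, ${this.name}!`;
      }
    }

    const main = async () => {                  // async/await
      await sleep(100);
      console.log(new Greeter(host).greet(), merged, port);
    };

    main();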
I have used the following approach, which binds the Angular application and the Node server together as a single unit.
The steps for creating the project structure are:
Create a new Angular project with the CLI.
Create a server.js file in the root directory of the project and configure it to render the contents of the dist/ folder on the / route.
You can refer to the link for the server code: https://github.com/nikhilbaby/node-server
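The linked repository has the actual code; roughly, such a server.js looks like the sketch below (the dist path and port are assumptions; newer Angular CLI versions nest the output under dist/<project-name>):

    const express = require('express');
    const path = require('path');

    const app = express();
    const distDir = path.join(__dirname, 'dist'); // adjust to your build output folder

    // Serve the Angular build output on the / route
    app.use(express.static(distDir));

    // Fall back to index.html so Angular's client-side routing keeps working
    app.get('*', (req, res) => res.sendFile(path.join(distDir, 'index.html')));

    app.listen(process.env.PORT || 3000, () => console.log('Server started'));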
Running the server
I usually run the project with ng build && node server. This makes sure that the Angular application is built first and the Node server is started after that.
I am working on a project that is hosted as a monorepo. For simplification purposes let's say that inside there are three self-explanatory packages: server, a webapp client and library. The directory structure would be something like the following:
the-project
packages
server
src
webapp
src
library
src
All packages employ Flow type notation, use a few post-ES5 features and, for this reason, go through Babel transpilation. The key difference is that transpilation of the webapp package is done via webpack, whereas server employs a gulp task that triggers script transpilation through the gulp-babel package. library is transpiled automatically when webapp is built.
Now, the problem I have is that for server to build, babel requires library to be built first and its package.json to specify its (built) main JS source file so its transpiled artifacts can be included. As you can imagine, this would quickly become problematic if the project were to contain multiple libraries that are actively being developed (which it does), as all would require building, including any dependent packages (like server in this simple case).
As an attempt to overcome this annoyance, I initially thought of using webpack to build the server, which would take care of including whatever dependencies it requires into a bundle, but I ran into issues as apparently webpack is not meant to be used on node JS applications.
What strategies are available for building a node JS application requiring Babel transpilation, such that the application's source files as well as any dependencies are built transparently and contained in a single output directory?
Annex A
Simplified gulp task for transpilation of scripts, as employed by server.
const gulp = require('gulp');
const babel = require('gulp-babel');
// Transpile only server's own sources from src/ into dist/
exports.transpile = () =>
  gulp
    .src([`src/**/*.js`], { allowEmpty: true })
    .pipe(babel({ sourceMap: true }))
    .pipe(gulp.dest('dist'));
As can be seen above, only server's own source files are included in the task. If src were to be changed to also include library, the task would emit the dependencies' artifacts in server's own output directory and any require('library') statements within would attempt to locate the built artifacts in packages/library and not packages/server/dist, thus resulting in import failures.
First of all, I am not sure what your server is doing. If it is handling database connections or doing heavy computation, then I would not recommend building it with webpack. Whereas if your server just does server-side rendering and makes some API calls to other servers, then I would recommend bundling it with webpack.
A lot of projects follow this philosophy. For example, you can take a look at something similar I have done in one of my personal projects [Blubus]. Specifically, you might be interested in webpack-server-config. And you can also take a look at how big projects like spectrum do it.
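As a hedged sketch (not taken from the linked projects; the entry and output names are made up), a webpack config that bundles a Node server typically sets target: 'node' and keeps node_modules out of the bundle:

    // webpack.server.config.js (illustrative only)
    const path = require('path');
    const nodeExternals = require('webpack-node-externals');

    module.exports = {
      target: 'node',                 // bundle for Node, not the browser
      entry: './src/index.js',
      output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'server.bundle.js',
      },
      externals: [nodeExternals()],   // leave node_modules to be resolved at runtime
      module: {
        rules: [
          { test: /\.js$/, exclude: /node_modules/, use: 'babel-loader' },
        ],
      },
    };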
I've started to convert my project to TypeScript and in the meanwhile I decided to improve code reuse between the Node.js backend, the React web client and the React Native mobile client.
Here is my project structure:
├ commons (code shared between backend and frontend, mostly type definitions and utils)
├ clients (code shared between web and mobile clients, depends on commons: containers, utils, forms... )
├ backend (express server, depends on commons)
├ web (create-react-app, not ejected, no SSR needed, depends on clients)
├ mobile (react-native client, depends on clients)
What I've tried so far:
Symlinks. Couldn't make them work with React Native (see the metro bundler issue).
This can probably be solved by using haul, but that adds extra complexity in making haul work with TypeScript. Also, haul doesn't seem to work with Mobile Center, which I use for publishing mobile apps.
Using rootDirs or paths in tsconfig.json (see the sketch after this list). The TypeScript compiler doesn't merge/bundle outputs, so this means I would need to support 3 different solutions to merge the generated code. It also doesn't work well with my IDE.
Using WML. I tried two approaches:
Linking commons and clients to packages inside node_modules of web/mobile/server. To do this, I have to generate declarations, which is burdensome because it requires exposing all imports (see this issue). It also doesn't play well with yarn, which will remove the linked packages every time it installs something new.
Linking commons and clients to a separate source folder inside the source folders of web, mobile and backend. This is what I use in the current JS version of my project. It works, but it has some downsides:
Long relative imports (this can probably be solved by supporting 3 different solutions for module aliases)
wml sometimes breaks in the background, which leads to some confusing, albeit easy to discover, errors
doesn't work well with hot reloading
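For reference, the paths/rootDirs approach from the second bullet looks roughly like this in web's tsconfig.json (a sketch only; the folder names are assumptions matching the layout above):

    {
      "compilerOptions": {
        // folder names below just mirror my commons/clients/web layout
        "baseUrl": ".",
        "rootDirs": ["src", "../commons/src", "../clients/src"],
        "paths": {
          "commons/*": ["../commons/src/*"],
          "clients/*": ["../clients/src/*"]
        }
      }
    }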
I'm looking for a solution which is not too complex (requires minimal configuration on web/mobile/backend sides) and plays well with Webstorm.
It probably doesn't exist now, so I'd like to hear what other solutions people here use for similar project setups.
In a similar situation to yours I did the following:
commons gets published as a separate npm package. You can make it private if necessary.
clients depends on commons package and gets published to npm as well. Again - private if required.
web and mobile projects both reuse npm packages created above.
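The consuming projects (web, mobile, backend) then just declare them as regular dependencies in their package.json; roughly like this (the scope and versions are made up):

    {
      "dependencies": {
        "@myorg/commons": "^1.0.0",
        "@myorg/clients": "^1.0.0"
      }
    }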
I have a new project which will consist of two parts: a client SPA using AngularJS, and a server side using Node.js and MongoDB. Many articles recommend using Mean.io when developing similar projects, but I couldn't find any information on why this is better than simply installing Node.js, Mongo and AngularJS and using them.
So can someone please tell me the benefits of using Mean.io over installing Node, Mongo, Angular and Express and using them? Or, in other words, why is it better to use Mean.io rather than downloading and installing each package/framework individually? Thanks
Well, you are going to save a lot of time thanks to plain, simple boilerplate code. In my case I've been using MEAN.JS, which helps you with a lot of basic functionality like:
login using passport, local strategies, social network strategies (g+, facebook, twitter)
Twitter bootstrap
Consistent folder structure
Consistent file naming
Environment configuration for dependencies and custom "settings" (dev, test, production)
pre-configured routes with controllers
software development workflow:
grunt (preconfigured tasks like jshint, build, test)
yeoman generator (saves you a lot of time)
nodemon (for restarting the server every time you save a file)
testing frameworks for client and server side
I could list a lot more, but the point is that the benefits go far beyond just putting all 4 main pieces of software together.
For more info you can look at the overview.
I'm working on including Ember in an already deployed Node/Express/EJS application. I don't want to disrupt any of the existing application behavior, but instead want to build any additional features of the app using Ember. The server-side code for these new features has already been built, and each endpoint returns the JSON format that Ember Data expects. I've been looking into Ember App Kit and ember-cli, but I'm not sure how to include these tools in my existing directory structure, and I'm not certain whether these are in fact the right tools for my use case. Does anyone have any experience with this particular use case?
For example, navigating to /foo returns the existing express route that renders an ejs template, but /bar would be an Ember route that hits the api endpoint of the same name.
Use ember-cli (ember-cli.org). It's perfect for this situation as it allows you to rapidly prototype your Ember app. It even comes with an ExpressJS-based testing suite and mock server.
Once you are ready to incorporate it into your NodeJS, Flask, or whatever other application, all static files should be available in the ember-cli dist directory.
Just don't forget to build the ember-cli project before porting, by means of ember build. After that it's just a simple matter of moving the files in the Ember project's dist folder to wherever you need them in your environment.
Just to embellish a bit: Ember-cli has a great work-flow for use while building your ember app. Try ember serve for a quick example. I mention this because it speaks to your question of how to incorporate this into your existing project (by project, I assume you may mean workflow). I typically will build ember projects purely using ember-cli and consider the back-end (usually a REST-API exposed via either Flask or NodeJS) a separate concern. When importing the app all I have to concern myself with is making sure my server serves the correct static dist files.
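To make the /foo vs /bar split from the question concrete, here is a hedged sketch of the Express side, assuming the contents of ember-cli's dist folder get copied into public/ember (all paths here are assumptions):

    const express = require('express');
    const path = require('path');

    const app = express();
    app.set('view engine', 'ejs');

    // Existing EJS-rendered routes keep working as before
    app.get('/foo', (req, res) => res.render('foo'));

    // New feature served by the built Ember app
    const emberDist = path.join(__dirname, 'public/ember');
    app.use('/bar', express.static(emberDist));
    app.get('/bar/*', (req, res) => res.sendFile(path.join(emberDist, 'index.html')));

    app.listen(3000);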
I would not recommend using the Ember App Kit (EAK) as it has been deprecated in favor of ember-cli. It really is... much, much better.
Ok so I'm going to try to be more complete in this answer. Let's start with the isolated question - ember-cli or eak? Definitely Ember-Cli, but why?
EAK is officially heading for deprecation in favor of ember-cli.
Ember-cli produces more structured, cleaner, maintainable ember code.
Ember-cli integrates your entire ember-app workflow.
Managing all types of dependencies and assets is made simple via bower install --save and Brocfile.js edits. (See the ember-cli docs for an explanation.)
Now the more complicated part of the question. How do I integrate this with an existing workflow? I recently ran into this when building a webrtc-included ember app. It just so happened that this was my first real use of ember as well. So, not yet realizing the full potential of my new hammer I wrote the REST API, Backend ORM layer, the signalling service, the session cache, and built a complete CI workflow first. Then I was ready to build my ember app and ended up in your exact position.
To short circuit a long story - the lesson I learned was that I should treat my ember-cli app like a completely separate concern. What I mean here is - there's my backend (NodeJS, Apache, Nginx... whatever) and what I code here is built, tested and integrated separately. It normally even lives in its own git repository. It's a separate concern to my front end equation which typically would consist of several components itself. My I-Phone Native app would have its own workflow from build-to-test and integrate to my backend via a REST API. My Android native app another. My web app another. For all intents and purposes, in my workflow these are entirely separate workflows that only tie together when we start talking Continuous Integration.
There's a lot of arguments for why you'd want to do this. Most importantly - it scales.
The beauty of ember-cli is that it makes it fairly trivial to get a workflow for your ember app going and trivial to redeploy your app + workflow on a new box/instance. I would certainly recommend referring to the official ember-cli setup instructions but I'm going to include them here in case the URL goes bad one day:
No really, refer to the link my instructions will suck in comparison...
Deploying a new Ember App
Install NodeJS, NPM and Git (ember-cli will by default get git going for you on new apps) on your system via sudo apt-get install nodejs and sudo apt-get install npm, sudo apt-get install git.
Note: On Ubuntu 14.04 and some other Debian systems use sudo apt-get install nodejs-legacy instead. If in doubt, use legacy. If you experience problems using the node command after install, it's definitely that you need to use nodejs-legacy. Don't bother trying to do the linking manually.
Install required node modules globally: sudo npm install -g ember-cli, sudo npm install -g bower, sudo npm install -g phantomjs
Create new ember-cli app: cd <Desired Directory>, ember new my-app-name
Now you can look at ember help to begin learning how to use ember-cli. Hint: the --dry-run flag is your friend. You'll notice that when you installed ember-cli all the scaffolding was taken care of for you. You'll see that you can add things with simple ember generate commands and they will not only create the required objects, but the test files as well. Best of all, using ember serve you can start scaffolding your app, and via simple flags (e.g. --proxy) you can configure the test server to actually proxy to and use your already-existing REST API (if you have one), or use the ExpressJS mock server to build a pseudo-API.
Integrating it with your larger workflow from here is a simple matter of configuring whatever tools you use (I use Jenkins and Ansible for this kind of stuff) to distribute the dist folder of ember-cli to where it should go to be served as static content (it is just a single page webapp in the end).
If you want to instead play with an existing ember-cli app that operates in an isolated workflow and already makes use of most of the goodies in order to get some familiarity - as I suspect you'll quickly realize how to fit this into whatever your current structure is - feel free to clone and play with this one here.
And so finally, to answer the more specific question of how this might fit into an existing directory structure, I would break this down into two categories. When we're talking src, I would have it in its own structure, separated at least by being in a sub-directory of its own. When we're talking built and deliverable, I would include the contents of the /dist folder in whatever static web server directory you want to serve your ember app from.
EDIT: I added some more detail - hopefully useful - below the line break. Let me know if you have more questions or if I can explain anything better.
I am facing a similar situation. I am planning to use EAK as a "prototyping tool" in a separate project folder. Then build the distribution directory from EAK using grunt dist and insert that into the assets folder of my main Node.js project.