nodejs, docker, nginx and amazon aws deployment - node.js

There have been many questions about Docker, Node and Amazon AWS, and I have read most of them, but I haven't found my answer.
I have been working on a production Node.js API project for the last few weeks, and now that the APIs are complete I have to deploy them.
There are a total of 2 microservices (this may increase later) and some worker processes. Different components of the system will communicate with each other using SQS and SNS. Each of the microservices uses MongoDB as the NoSQL store and Mongoose as the ODM. I chose MongoLab as the MongoDB hosting provider. Currently I can connect to the MongoLab DB using the MONGOLAB_URI environment variable (obviously this will not be enough in production; any suggestion on this one is welcome).
I am going ahead with amazon aws platform.
My thought process is:
I will dockerize each of the components. For the worker processes this is straightforward.
For the microservices I will have 2 Docker images, which I will deploy using Amazon EC2 Container Service. I will have a third nginx Docker image, which I will put in front of the Node applications.
I am planning to create a cluster of 2 machines (c2 large) initially and host these three Docker images (the two microservices plus nginx) on them.
Obviously each Node process will run on some port. Let's assume it is 3100.
Everything up to this point is clear; the problem arises when I want to expose these APIs to the outside world.
The microservices expose some endpoints, like:
service 1: /users, /login, /me, etc.
service 2: /offers, /gifts, etc.
My Question is:
I want to resolve
mydomain.com/api/v1/users to service1:3100/users
and similarly for the other APIs.
I assume this can be done with nginx, but I am not very familiar with it.
The constraints are:
I don't want to host each microservice on a separate machine (budget constraint).
I don't know which service will run on which machine (I assume this because I read that EC2 Container Service automatically starts Docker containers on arbitrary machines and distributes the load).
How can I do this?
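For reference, a minimal nginx sketch of this kind of path-based routing could look like the following. The port and the /api/v1 prefix come from the question; the upstream names and endpoint lists are assumptions, and in an ECS setup the upstream addresses would come from service discovery or a load balancer rather than fixed hostnames:

```nginx
# Hypothetical sketch: route /api/v1/* to the right microservice,
# stripping the /api/v1 prefix before proxying.
upstream service1 { server service1:3100; }   # users/login/me service
upstream service2 { server service2:3100; }   # offers/gifts service

server {
    listen 80;
    server_name mydomain.com;

    # mydomain.com/api/v1/users -> service1:3100/users
    location ~ ^/api/v1/(users|login|me) {
        rewrite ^/api/v1/(.*)$ /$1 break;
        proxy_pass http://service1;
    }

    # mydomain.com/api/v1/offers -> service2:3100/offers
    location ~ ^/api/v1/(offers|gifts) {
        rewrite ^/api/v1/(.*)$ /$1 break;
        proxy_pass http://service2;
    }
}
```

The `rewrite ... break` drops the /api/v1 prefix so the services keep their bare /users, /offers routes.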

Related

Deploy React, Node/Express, and MySQL web application onto the web with Amazon AWS

I have lately been learning React and Node.js and have been having a lot of fun. I want to deploy/publish my work onto the internet using Amazon AWS (to share my app with friends and potential employers). However, I am having trouble.
I have researched this process quite a bit and can't seem to find any resources that are detailed enough and that touch on each aspect of my web app. To clarify, my front end is written in React, my back end is in Node/Express, and that is connected to a MySQL database. Currently all of this is stored locally on my computer.
I'm not sure how to proceed... I've played around with adding all my code to GitHub and running through the deploy feature on AWS Amplify, but that seemed limited to the React front end (at least I could not discover anything about including the functionality of my back end, my database connection, and queries from Node).
Can anyone point me in the right direction? Any tips, suggestions, and/or resources to aid in this process would be appreciated. Specifically: how do I deploy my React, Node/Express, and MySQL web app (currently stored entirely on my local computer) to the internet through Amazon AWS?
To generify the question a little: you want to deploy a frontend, a backend and a database to AWS.
(Un)fortunately, there are lots of different options for this. Let's explore a little.
Frontend
Assuming that your frontend is a set of static resources (HTML/JS/CSS), you don't need much more than a web server. You can use S3 (an object store that can also serve websites), optionally with CloudFront (a content delivery network) in front of it, or run a virtual machine on EC2, install a web server there and deploy your frontend to it.
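As an illustration, publishing a React build to S3 static hosting can be as short as the following AWS CLI sketch (the bucket name is a placeholder, and your account's S3 public-access settings have to permit public reads):

```shell
# Hypothetical commands: build the React app and push it to an S3
# bucket configured for static website hosting.
npm run build
aws s3 mb s3://my-react-frontend-bucket
aws s3 website s3://my-react-frontend-bucket --index-document index.html
aws s3 sync build/ s3://my-react-frontend-bucket --acl public-read
```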
Backend
Lots of options here. You can package your app in a Docker container and use ECS (container service) or EKS (Kubernetes service). You could also run your backend on Elastic Beanstalk (comparable to Heroku). Or run a virtual machine on EC2 and deploy your backend there.
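If you go the container route, a minimal Dockerfile for a Node/Express backend might look like this sketch (the entry-point file name and port are assumptions):

```dockerfile
# Hypothetical Dockerfile sketch for a Node/Express backend.
FROM node:18-alpine
WORKDIR /app
# Copy manifests first so the dependency layer is cached between builds.
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
EXPOSE 3000
CMD ["node", "server.js"]
```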
Database
You can choose between a managed/hosted database like RDS, or roll your own by running a virtual machine and installing a database server on it.
So, what to pick? It depends on what you're comfortable with. If you have a bit of experience with managing Linux servers, you could start an EC2 instance, install a web server like nginx or Apache, install Node.js, install MySQL, and then copy your frontend, backend and database scripts/backup to the server.
If you're not comfortable with managing Linux servers, you could go for hosted/managed solutions like S3, Elastic Beanstalk and RDS.
Do note that when your frontend runs on a different domain/URL than your backend, your backend needs to set CORS headers; otherwise the browser won't allow your frontend to make HTTP requests to your backend.
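As a tiny sketch of what that amounts to (the allowed origin is an assumed placeholder; in a real Express app you would typically reach for the `cors` middleware package instead of hand-rolling this):

```javascript
// Minimal sketch: compute the CORS headers a backend should send when
// the frontend lives on a different origin. ALLOWED_ORIGIN is an
// assumed placeholder domain.
const ALLOWED_ORIGIN = 'https://app.example.com';

function corsHeaders(requestOrigin) {
  // Only echo back an origin we explicitly trust.
  if (requestOrigin !== ALLOWED_ORIGIN) return {};
  return {
    'Access-Control-Allow-Origin': requestOrigin,
    'Access-Control-Allow-Methods': 'GET,POST,PUT,DELETE,OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type,Authorization',
  };
}
```

The backend would merge these headers into every API response (and answer OPTIONS preflight requests with them).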
Hope this helps - good luck!
Elastic Beanstalk (EB) could be a good start, as it can provision all the resources needed for Node.js applications without requiring much knowledge about setting up and managing everything from scratch:
Deploying Node.js applications to Elastic Beanstalk.
For simplicity you can start with the single-instance environment type (no load balancer) and see how it goes.
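With the EB CLI installed, that single-instance flow is roughly the following (the application and environment names are placeholders):

```shell
# Hypothetical EB CLI session: deploy a Node.js app as a
# single-instance environment (no load balancer).
eb init my-node-app --platform node.js --region us-east-1
eb create my-node-env --single
eb deploy   # redeploy after changes
eb open     # open the environment URL in a browser
```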
EB can also setup a database for you:
Adding a database to your Elastic Beanstalk environment
If you get more comfortable working with AWS, you can scale up to a load-balanced EB environment, or look at other options such as your own EC2 instances with auto-scaling groups, load balancers, container services and more.

Deploy a nodejs server for web CRM for production

I'm looking for any method or tutorial to deploy a Node.js Express app on a local server/computer. This will be for a production environment. Everything I read about solutions like Zeit Now, localtunnel, forever, pm2 and similar says that they aren't recommended for production environments. The idea is to have a public website without a hosting provider. I need a method that allows more than one Node app/site to be active at the same time.
When people say a component is not recommended for production, it does not mean that it is not stable. Most of the time it means that it is not a full-blown solution that considers all the aspects of a production deployment:
scalability
fail-over
security
configurability
automation
etc.
If you are trying to build a solution that has precise requirements (requests per second, media streaming, etc.), you should state them in your question as well to make it concrete. If this is not the case, just install a basic setup that runs your configuration and fix bottlenecks as they appear. Don't try to build a theoretically perfect solution now.
A couple of examples:
A classical setup (goes well with Do-It-Yourself deployments)
install Git + (Node.js and NPM) + (Forever or equivalent) + your database (e.g. MongoDB) + (NGINX or HAProxy) on your favourite/accepted Linux distribution
clone each Node.js app in its own directory
install cronjobs for basic monitoring and maintenance
add scripts to dynamically remove/add NGINX web server configurations based on deleted/added Node.js apps
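The classical steps above might look roughly like this on a Debian/Ubuntu box (package names, repository URL, paths and the cron script are all assumptions):

```shell
# Hypothetical sketch of the classical Do-It-Yourself setup above.
sudo apt-get install -y git nginx nodejs npm
sudo npm install -g forever

# One directory per Node.js app.
git clone https://example.com/my-app.git /srv/my-app
cd /srv/my-app && npm install --production

# Keep the process alive across crashes.
forever start server.js

# Basic monitoring/maintenance via cron (assumed script).
echo '*/5 * * * * /srv/scripts/check-apps.sh' | crontab -
```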
A more modern setup (goes well with AWS/GCE deployments but also possible locally with tools like skaffold)
install a Kubernetes cluster on a couple of machines
prepare a base Docker container image that matches all your Node.js applications
if required, add a Dockerfile to each Node.js application to build one Docker image per application based on the base Docker container image
add a new deployment for each of your Node.js applications
Kubernetes will handle the "keep-alive" for you
fill in the plumbing between your server network (DNS, IP, ports) and the IPs provided to you by Kubernetes (NGINX or HAProxy could also fill this role)
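A sketch of such a deployment for one of the apps (image name, replica count and port are assumptions):

```yaml
# Hypothetical Kubernetes Deployment: Kubernetes restarts crashed
# pods, which is the "keep-alive" mentioned above.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-node-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-node-app
  template:
    metadata:
      labels:
        app: my-node-app
    spec:
      containers:
        - name: my-node-app
          image: registry.example.com/my-node-app:1.0.0
          ports:
            - containerPort: 3000
```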

Deploy node.js in production

What are the best practices for deploying a Node.js application in production?
I would like to know how Node.js APIs are deployed for production today. Currently my application is in Docker and running locally.
I wonder whether I should run nginx inside the container in front of my server, or just upload the Node image that is already running today.
*I need load balancing
There are few main types of deployment that are popular today.
Using a platform as a service like Heroku
Using a VPS like AWS, DigitalOcean, etc.
Using a dedicated server
This list is in order of growing difficulty and control. It's easiest with a PaaS, but you get more control with a dedicated server - though it gets significantly more difficult, especially when you need to scale out and build clusters.
See this answer for more details on how to install Node on a VPS or a dedicated server:
how to run node js on dedicated server?
I can only add from experience on AWS, using a NAT gateway with a dedicated Node server and a MongoDB server behind the gateway. (Obviously this is a scalable system and project.)
With or without Docker, you need to control the production environment. This means clearly defining which NPM libraries you need for production, and how you handle environment variables and clustering across cores.
I would suggest, very strongly, using a tool like PM2 to handle clusters, server shutdowns and restarts, and logs (also workers and slaves, if you need them and have code for them).
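A hypothetical PM2 ecosystem file for cluster mode might look like this sketch (the app name, script path, memory limit and port are assumptions):

```javascript
// Hypothetical PM2 ecosystem config sketch: run one worker per CPU
// core in cluster mode, with production environment variables.
const config = {
  apps: [
    {
      name: 'api',
      script: './server.js',       // assumed entry point
      instances: 'max',            // one worker per core
      exec_mode: 'cluster',        // PM2 cluster mode
      max_memory_restart: '300M',  // restart a worker if it leaks
      env_production: { NODE_ENV: 'production', PORT: 3100 },
    },
  ],
};

module.exports = config;
```

You would then start it with something like `pm2 start ecosystem.config.js --env production`.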
This list can go on and on, but keep in mind this is only from an AWS perspective. Setting up a Gateway correctly on AWS is also not an easy process. Be prepared for some gotcha's along the way.

Scaling a two-tier MEAN application with Docker on AWS

So my current setup is a Node.js application that serves an Angular front end, a second Node.js application that uses Express and serves as an API, and a MongoDB instance. Simply put, the client-side app talks to the back-end app, and the back-end app talks to MongoDB.
I was looking into how to Dockerize these applications, and it seems like some examples use linking. So my question is: does linking only work on the same host (meaning a single EC2 instance on AWS), or across multiple EC2 instances? If only the former, and if I have both apps and Mongo containerized on one instance, how do I scale out? If I spin up a second EC2 instance, would I put both containerized Node apps and Mongo again on that second instance? Is having a Mongo container on the same instance as the Node apps a single point of failure? How is that fault tolerant?
Just trying to wrap my head around this; apologies for my ignorance on the subject. Thanks!
You should put each app, as well as the MongoDB server, in separate containers (which is what I think you intend); the linking (via Docker Compose or another method) is just networking. If you use Docker links, it creates a private network. You can create other networks for the containers to talk to each other, and also to a LAN, WAN, whatever.
Yes, putting them all on the same EC2 instance creates a single point of failure.
If that's a concern, look into: https://docs.docker.com/swarm/networking/
Docker Swarm is fully compatible with Docker's networking features. This includes the multi-host networking feature which allows creation of custom container networks that span multiple Docker hosts.
Or load balance your apps and use an AWS-hosted MongoDB cluster. There are many possible approaches based on your needs and budget.
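On a single host, that separate-containers-plus-networking setup can be sketched with Docker Compose (service and image names are assumptions); Compose puts all three containers on a private network where they reach each other by service name:

```yaml
# Hypothetical docker-compose sketch: three containers on one private
# network; the API reaches MongoDB as "mongo" by service name.
version: "3.8"
services:
  frontend:
    image: my-angular-app:latest
    ports:
      - "80:3000"
  api:
    image: my-express-api:latest
    ports:
      - "3100:3100"
    environment:
      - MONGO_URL=mongodb://mongo:27017/mydb
  mongo:
    image: mongo:6
    volumes:
      - mongo-data:/data/db
volumes:
  mongo-data:
```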

RESTful API 2x nodejs apps on same server, with fallback

Micro Services: I would like to have front-end-web and back-end-api Node.js applications running, communicating via RESTful HTTP APIs, on a single machine (read: EC2).
Stateless: I would like to scale these horizontally across EC2 instances in the future, using Redis (ElastiCache) and MySQL (RDS) (read: stateless).
Load Balanced: when scaled, I would load balance with an ELB. No problem there.
QUESTION: If a back-end-api goes down on a machine, is it possible to somehow fall back to another EC2 server running a back-end-api instance? How would I do this?
Why not separate the API app? Well, I would like to keep them on the same server for latency and maintainability.
Oh, by the way, I use Docker :-)
