Using same EC2 instance for node and mongo hosting - node.js

Can I use my AWS EC2 instance to run both the Node server and the Mongo server by installing MongoDB on the instance?
It should work just like it does on localhost, right?

I'd advise you to separate concerns and use dedicated instances for Node.js and for MongoDB. Separating the app and the database is not only good practice; it also lets you scale much more easily, and scale exactly the component that needs it.
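
To see how small the difference is on the application side, here is a minimal sketch (the database name and addresses are placeholders, not from the question): whether MongoDB runs on the same instance or on a dedicated one, the only thing that changes in the Node code is the host in the connection string.

```js
// Minimal sketch using the official MongoDB Node.js driver.
// The database name and the private address below are hypothetical.
const { MongoClient } = require('mongodb');

// Same-instance setup: mongod listens on the loopback interface.
// Dedicated-DB setup: set MONGO_URI to that instance's private IP or DNS
// name, e.g. 'mongodb://10.0.1.25:27017/myapp' -- no code changes needed.
const uri = process.env.MONGO_URI || 'mongodb://127.0.0.1:27017/myapp';

async function main() {
  const client = new MongoClient(uri);
  await client.connect();
  const users = await client.db('myapp').collection('users').find().toArray();
  console.log(users);
  await client.close();
}

main().catch(console.error);
```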

Related

AWS multiple services inside one EC2 instance for great speed?

Since I use separate hosting for the database and the Node.js server, the speed is not good; if everything were on one machine, local data exchange would be faster. How can I run these services on one AWS instance (Node.js, Redis, MongoDB)? In production it is not recommended to run the database together with the server. Is it possible to fine-tune AWS so that the speed between the databases and the server matches what I would get on a single computer?
Please help; don't spare your advice!

Scaling a two-tier MEAN application with Docker on AWS

So my current set-up is a Node.js application that serves up an Angular front-end, a second Node.js application that uses Express and serves as an API, and a MongoDB instance. Simply put, the client-side app talks to the back-end app and the back-end app talks to MongoDB.
I was looking into how to Dockerize these applications and it seems like some examples use linking. So my question is: does linking only work on the same host (meaning a single EC2 instance on AWS), or across multiple EC2 instances? If only the former, and I have both apps and Mongo containerized on one instance, how do I scale out? If I spin up a second EC2 instance, would I put both containerized Node apps and Mongo on that second instance again? Is having a Mongo container on the same instance as the Node apps a single point of failure? How is that fault tolerant?
Just trying to wrap my head around this and apologize for my ignorance on the subject. Thanks!
You should put each app as well as the MongoDB server in separate containers (which is what I think you intend), and the linking (via Docker Compose or another method) is just networking. If you use Docker links, it creates a private network; you can create other networks so containers can talk to each other, or to a LAN, WAN, whatever.
Yes, putting them all on the same EC2 instance creates a single point of failure.
If that's a concern, look into: https://docs.docker.com/swarm/networking/
Docker Swarm is fully compatible with Docker's networking features. This includes the multi-host networking feature which allows creation of custom container networks that span multiple Docker hosts.
Or load balance your apps and use an AWS-hosted MongoDB cluster. There are many possible approaches based on your needs and budget.
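
To make the "linking is just networking" point concrete, here is a minimal sketch (the service name `mongo` and the database `myapp` are assumptions, not from the question): inside a Node container you connect to the MongoDB container by its link/service name instead of localhost, and the same code works whether the containers share a host or sit on a multi-host overlay network.

```js
// Minimal sketch with Mongoose, assuming the MongoDB container is reachable
// under the hostname "mongo" (its Docker link alias / Compose service name).
const mongoose = require('mongoose');

// On a single host (links or a user-defined bridge network), and across
// hosts with overlay networking, "mongo" resolves to the MongoDB container,
// so no IP addresses are hard-coded in the app.
mongoose.connect(process.env.MONGO_URL || 'mongodb://mongo:27017/myapp')
  .then(() => console.log('connected to MongoDB container'))
  .catch((err) => console.error('connection failed', err));
```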

Using MongoDB with an AWS Elastic Beanstalk application

I have an Elastic Beanstalk application running (set up with Node.js) and I wondered what the best way to integrate MongoDB would be. As of now, all I've done is SSH into my EB instance (with the EB CLI) and install MongoDB. My main worry comes from the fact that the Mongo database lives on my instance. As I understand it, that means my data will almost certainly be lost as soon as I terminate the instance. Is that correct? If so, what is the best way to go about hooking an EB app up to a MongoDB? Does AWS support that natively without having to go rent a DB on a dedicated server?
You definitely do NOT want to install MongoDB on an Elastic Beanstalk instance.
You have a couple of options for running MongoDB on AWS. One is to install it yourself on EC2 servers (NOT Elastic Beanstalk servers) and handle all the management yourself. The other is to use mLab (previously MongoLab), a managed MongoDB-as-a-Service provider that works on AWS as well as other cloud services. Using mLab you can easily provision a MongoDB database in the same AWS region as your Elastic Beanstalk servers.
Given the steps involved in setting up a highly available MongoDB cluster on AWS, I generally recommend using mLab instead of trying to handle it yourself. Just know that the free tier is not very performant and you will want to upgrade to one of the paid plans for your production database.
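
For what this looks like from the application side, here is a minimal sketch (the variable name MONGODB_URI is just a convention; the actual connection string would come from your mLab dashboard and be set as an Elastic Beanstalk environment property): the data lives with mLab, so terminating the instance no longer loses anything.

```js
// Minimal sketch: connect from a Node app on Elastic Beanstalk to a hosted
// MongoDB such as mLab. MONGODB_URI is assumed to be set as an Elastic
// Beanstalk environment property; the name itself is just a convention.
const mongoose = require('mongoose');

// e.g. mongodb://user:pass@ds012345.mlab.com:12345/mydb (placeholder)
const uri = process.env.MONGODB_URI;
if (!uri) {
  throw new Error('MONGODB_URI is not set');
}

mongoose.connect(uri)
  .then(() => console.log('connected to hosted MongoDB'))
  .catch((err) => {
    console.error('failed to connect', err);
    process.exit(1);
  });
```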
Been there, done that. As @MarkB suggested, it'd be a lot easier to use a SaaS instead.
AWS by itself doesn't have native MongoDB support, but depending on your requirements you could find a solution with little or no extra cost (besides the EC2 price) on the AWS Marketplace. These images are vendors' pre-configured, production-ready AMIs of popular tools like MongoDB.

Server segregation of nodejs and mongo in amazon

Why are there dedicated web services just for MongoDB? With LAMP I would simply install everything on my EC2 instance. Now that I'm deploying a MEAN stack, should I separate MongoDB and my Node server? I'm confused. I don't see any limitation to mixing Node with mongod on one single instance, and I could use tools like MongoLab as well.
Ultimately it depends on how much load you expect your application to have and whether or not you care about redundancy.
With Mongo and Node you can install everything on one instance. When you start scaling, the first split is to separate the application from the database. Often it's easier to set things up that way from the start, especially if you know you will have the load to require it.

nodejs, docker, nginx and amazon aws deployment

There have been many questions about Docker, Node and Amazon AWS, and I have read most of them, but I haven't found my answer.
I have been working on a production Node.js API project for the last few weeks, and now that the APIs are complete I have to deploy them.
There are a total of 2 microservices (this may increase later) and some worker processes. Different components of the system will communicate with each other using SQS and SNS. Each of the microservices uses MongoDB as the NoSQL storage and Mongoose as the ODM. I chose MongoLab as the MongoDB hosting provider. Currently I connect to the MongoLab DB using the MONGOLAB_URI environment variable (obviously this will not be enough in production; any suggestion on this point is welcome).
I am going ahead with the Amazon AWS platform.
My thought process is:
I will dockerize each of the components. For the worker processes this is straightforward.
For the microservices I will have 2 Docker images, which I will deploy using the Amazon EC2 Container Service. I will have a third nginx Docker image which I will put in front of the Node applications.
I am planning to create a cluster of 2 machines (c2 large) initially and host these 3 Docker images (the two microservices plus nginx) on them.
Obviously each Node process will run on some port. Let's assume it is 3100.
Up to this point everything is perfectly clear; the problem comes when I want to expose these APIs to the outside world.
The microservices expose endpoints like:
service 1: /users, /login, /me, etc.
service 2: /offers, /gifts, etc.
My question is: I want to resolve
mydomain.com/api/v1/users to service1:3100/users
and similarly for the other APIs.
I assume this can be done with nginx, but I am not very familiar with it.
The constraints are:
I don't want to host each microservice on a separate machine (budget constraint).
I don't know which service will run on which machine (I assume this since I read that the EC2 Container Service will automatically start Docker containers on available machines and distribute the load).
How can I do this?
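
For the nginx part, here is a minimal sketch of the path-based routing described above (the hostnames `service1`/`service2` and the exact routes are assumptions taken from the question; how each container is actually reachable depends on your ECS setup, e.g. a fixed host port or an internal load balancer per service):

```nginx
# Minimal sketch of path-based routing in nginx (the reverse proxy sitting
# in front of the Node containers). "service1" and "service2" are
# hypothetical hostnames for the two microservices.
server {
    listen 80;
    server_name mydomain.com;

    # Routes owned by the second service, e.g.
    # mydomain.com/api/v1/offers/123 -> service2:3100/offers/123
    location /api/v1/offers {
        proxy_pass http://service2:3100/offers;
    }
    location /api/v1/gifts {
        proxy_pass http://service2:3100/gifts;
    }

    # Everything else under /api/v1/ goes to the first service, e.g.
    # mydomain.com/api/v1/users -> service1:3100/users
    # (the matched /api/v1/ prefix is replaced because proxy_pass has a URI).
    location /api/v1/ {
        proxy_pass http://service1:3100/;
        proxy_set_header Host $host;
    }
}
```

Because longer location prefixes win, /api/v1/offers and /api/v1/gifts are sent to service2 while everything else under /api/v1/ falls through to service1.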
