Node.js app on AWS for beginner

I started learning node.js a couple weeks ago and just finished my first small project, a basic live chat website using socket.io and express. The structure for my project looks like this:
ChatApp
|
|____ backend.js           // node server-side code
|
|____ static
|        |
|        |____ libs
|                |
|                |____ app.js          // front-end logic
|                |
|                |____ jquery.min.js
|
|____ views
         |
         |____ index.html  // client website
My goal right now is to learn how to use AWS to make my application available so people on different machines can talk to one another, not just me on my local server. I tried following this guide, which uses Elastic Beanstalk to deploy a sample repository, but I'm having a hard time seeing how to translate it to my folder structure, since the sample doesn't even have an HTML file, for instance.
My server code looks like this:
//*****************//
// Sets up backend //
//*****************//
var express = require('express');
var app = express();
var server = require('http').Server(app);
var io = require('socket.io')(server);

server.listen(8080);

app.use(express.static(__dirname + '/views'));
app.use(express.static(__dirname + '/static'));

var users = [];

//*****************//
// Sends out html  //
//*****************//
app.get('/', function(req, res){ // Main page
    // No view engine is configured, so send the file directly
    // (res.render('index.html') would fail without one)
    res.sendFile(__dirname + '/views/index.html');
});

//*************************//
// Handles socket requests //
//*************************//
io.on("connection", handleIO); // Called when a user connects

function handleIO(socket){
    console.log('Client connected...');
    // Bunch of socket.io code I didn't think was necessary to add
}
Anyways, I was wondering if any of you enlightened folks could help a noob out with deploying his first website. If you could either give me a general outline or point me to one I'd really appreciate it as AWS can be pretty intimidating when first starting out. Thanks.

I would say jumping straight into Amazon Web Services would be a mistake: AWS is an abstraction layer over a huge number of tasks you can perform as a cloud administrator.
If you do not have the basic concepts of server administration, or have not worked in a similar capacity, it can prove counter-productive.
Still, if you are willing to make the jump, here are the steps I would recommend:
Learn to create an EC2 instance
Setup/install required software on your EC2 instance
Transfer your code to the EC2 instance
Configure/run your application.
If it helps, an EC2 instance is essentially a VPS (virtual private server) with shell access, and you can use it through the command line as you would a desktop Linux machine.
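To make those four steps concrete, here is a rough sketch of what they can look like from the command line. This assumes an Ubuntu-based instance; the key file (mykey.pem) and hostname (ec2-xx-xx-xx-xx.compute.amazonaws.com) are placeholders, not values from the question:

# steps 1-2: after creating the instance in the AWS console, SSH in and install Node.js
ssh -i mykey.pem ubuntu@ec2-xx-xx-xx-xx.compute.amazonaws.com
sudo apt-get update && sudo apt-get install -y nodejs npm

# step 3: from your local machine, copy the project to the instance
scp -i mykey.pem -r ChatApp ubuntu@ec2-xx-xx-xx-xx.compute.amazonaws.com:~/

# step 4: back on the instance, install dependencies and run the server
cd ~/ChatApp && npm install
node backend.js   # remember to open port 8080 in the instance's security group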

Boxfuse lets you deploy your Node.js app to AWS effortlessly in literally two steps from within your project directory:
npm-bundle: creates a tgz including your app and all the required node modules (install using: npm install -g npm-bundle)
boxfuse run -env=prod:
Creates a minimal image containing the contents of the tgz bundle as well as the Node.js runtime, the Linux kernel and a bootloader
Pushes that image in the secure Boxfuse Vault
Converts it into an AMI
Creates a new domain name
Provisions an elastic IP or an ELB (depending on the app type you configured)
Creates a security group with the correct ports mapped
Launches a new EC2 instance and ensures it is healthy
Assigns the elastic IP to the instance
Boxfuse is based on 3 principles: Immutable Infrastructure, Minimal Images and Blue/Green deployments with zero downtime.
Boxfuse also comes with out-of-the-box support for auto-scaling, relational databases and fast local development and testing on VirtualBox.
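Putting the two steps together, a deployment session from your project directory looks like this (these are the same commands described above):

npm install -g npm-bundle   # one-time install of the bundler
npm-bundle                  # produce a tgz of your app and its node modules
boxfuse run -env=prod       # build the image and deploy it to AWS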
We have a tutorial you can follow to get started in 5 minutes: https://boxfuse.com/getstarted/nodejs
Disclaimer: I am the founder and CEO of Boxfuse

Start with the AWS docs, get a free account on AWS, and get your hands dirty. Make things, break things, and fix them again; that is how you will learn AWS.
Regarding study material:
1. AWS docs
2. re:Invent videos on YouTube
3. A Pluralsight account; they have very good courses on these topics
Start with EC2 and VPC.

Related

Accessing user uploaded images in a deployed MERN app

I am a relatively new developer and I have made a personal blog app, where a user can create a post and upload an image to use as the thumbnail for that post.
So far the backend and frontend work brilliantly and I am able to get the image, store it locally in a folder on my machine, store the file path in MongoDB and then access it and display it in the UI accordingly.
Now that I'm looking to finally deploy my application I have to figure out a way to upload images to an online cloud storage or something, where I can access them from my frontend as well.
Any suggestions on a good service of this kind and if possible, something free, suitable for my small project? Or any suggestions on an alternative way of dealing with this situation?
Any advice will be greatly appreciated!
Thanks in advance!
NOTE: I plan on deploying my app with Heroku, so if you've ever dealt with this issue directly using Heroku, please share your experience.
Yes, I have several apps that do just this! Sign up for a free MongoDB Atlas account and then you can store the data on their servers and point your Express app to the connection URL. Basically, they will give you a URL like this:
mongodb+srv://your-cluster-name:a232dfjoi39034r@atlas-free-cluster-czaoo.mongodb.net/blog-app?retryWrites=true
Which you can then store in a .env file like this:
MONGODB_URL=mongodb+srv://your-cluster-name:a232dfjoi39034r@atlas-free-cluster-czaoo.mongodb.net/blog-app?retryWrites=true
And access it from your app like so (mongoose must be required first; connectionOptions is whatever Mongoose options object you use):
const mongoose = require('mongoose');

mongoose.connect(process.env.MONGODB_URL, connectionOptions)
    .catch((err) => {
        console.log('Error on initial DB connection: ', err);
    });
You'll need to load the .env file in your app using an npm package such as dotenv on the development side; a minimal sketch:
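// at the very top of your entry point, before anything reads process.env
require('dotenv').config();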
For Heroku, you can use the heroku-cli and set any environment variables for your app like so:
heroku config:set MONGODB_URL=mongodb+srv://your-cluster-name:a232dfjoi39034r@atlas-free-cluster-czaoo.mongodb.net/blog-app?retryWrites=true
Note, the development MongoDB URL could be a connection string to a local instance, such as:
MONGODB_URL=mongodb://localhost:27017/my-blog-app
And the one for heroku can be the MongoDB Atlas cluster.
There are a few other config things to do for Node apps, like having a 'start' script in package.json, which for Express apps created with express-generator looks like:
"scripts": {
    "start": "node ./bin/www"
}
But your case may be different; it should point to whatever file is the entry point for your server.
You'll also need to set the PORT from process.env.PORT, which heroku will set on their end, and on your end you can just default to 3000:
const PORT = process.env.PORT || 3000;
app.listen(PORT);
If you have any questions, feel free to PM me or just ask here and I can elaborate further. It can be intimidating the first few times, but it's actually really easy to deploy stuff to heroku this way once you get the hang of it! You can also check out the heroku docs for deploying Node apps.

Set up a server from within an electron app

I haven't had any success searching for this, because I mostly find questions about people wanting to use data from a server inside their Electron app. That's not my case.
I have a regular app which uses a server on the internet, just like any other, but we want to make it available to schools without internet (without any, or without reliable internet). So what I'm trying to do is create a version of my server that runs from an Electron exe and serves files for the students connected to the wifi (but not the internet) to access. After the process is done "offline", I will sync the data from the Electron app itself.
I tried to run a server from Express but haven't made any progress so far. What I tried was to put the exact same code from my node server in my main.js file, with no luck.
I know that's not what Electron is supposed to do; if you're positively sure there is no way to do that, please tell me so I can search for another alternative.
A simple approach is to create a cluster where the master process is the Electron Main and the worker process is the server.
Example:
Change the main field in package.json to start.js.
In start.js, write:
const cluster = require('cluster');

if (cluster.isMaster) {
    require('./main.js'); // your electron main file
    cluster.fork();       // spawn a worker process for the server
} else {
    require('./server.js'); // your server code runs in the worker
}
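For completeness, here is a minimal sketch of what the server.js worker could look like (Express and the public folder are assumptions for illustration, not from the original answer):

// server.js — runs in the forked worker, alongside the Electron UI
const express = require('express');
const app = express();

// serve the files the students need over the local wifi
app.use(express.static(__dirname + '/public'));

app.listen(3000, function () {
    console.log('Local server listening on http://localhost:3000');
});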

How to separate express server code from Express business logic code?

All the Node.js tutorials that I have followed put everything in one file: importing libraries, routing, connecting to the database and starting the server with, say, express.js:
var app = require('express')(); // note: express() must be called to produce the app
app.get('/somePath', blah blah);
app.listen(...);
Now, I have 4 node servers behind an Nginx load balancer, and it becomes very difficult to keep the source code updated on all four servers.
Is there a way to keep the source code out of the server creation code in such a way that I can deploy the source code on the servers as one package? The server creation code should not know anything about routing or database connections. It should only be listening to changes in a folder and the moment a new module meta file appears, it starts hosting that web application.
Much like how Java code is packaged as a WAR by Maven and deployed to Tomcat's webapp folder, because instantiating Tomcat is not part of the source code. In node.js it seems the server is also part of the source code.
For now, the packaging is not my concern. My concern is how to separate the logic and how do I point all my servers to one source code base?
Node.js, or JavaScript for that matter, doesn't have a concept like a WAR. But it does have something similar. To achieve something WAR-like, you would essentially bundle the code into one source file using something like webpack. However, this will probably not work with Node.js core modules like http (which Express builds on), since they rely on native V8/C++ functions/libraries.
You could also use Docker and think of the Docker containers as WARs.
Here is what I figured out as a workaround:
Keep the servers under a folder, say "server_clusters", and put the different node servers there: node1.js, node2.js, node3.js, node4.js, etc. (I know that in the real world the clusters would be different VMs or CPUs altogether, but for now I simply want to separate the server creation logic from the source code.) These files would contain this code snippet:
var constants = require('./prop');

var appBasePath = constants.APP_BASE_DIR;
var appFilePath = appBasePath + "/main";
var app = require(appFilePath);

// each server would have a different port number while everything else stays the same
app.listen(8080, function () { // note: the listen callback takes no req/res arguments
    console.log("server started up");
});
Create a properties file that holds the path to the source code and exports the object. That simple: it is what the first line of the snippet above requires.
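A sketch of what that properties file could look like (the file name prop.js comes from the require above; the path is illustrative):

// prop.js — the one place that knows where the shared code base lives
module.exports = {
    APP_BASE_DIR: '/opt/myapp/source' // update this when the code base moves
};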
Create the source code project wherever you want on the machine and just update its home directory in the properties file above. The source code directory exports one landing file that provides the Express app for the servers to start:
var express = require('express');
var app = express();
module.exports = app;
With this, there are multiple servers that are pointing to the same source code.
Hope this helps those who are facing the same problem.
Other approaches are welcome.

Multiple nodejs servers or single?

Over the past year I've started around 40 independent web apps using nodejs (each running its own server on a custom port using express + socket.io). What really bugs me is that the pm2 process list has a vertical scroll )))
The question is: is it normal to run so many node servers, or is there a better way?
There is no issue with running multiple node servers; in fact, when it comes to a microservices architecture, the more you break things down, the better. There are a lot of pros and cons to this, and you need to figure out whether the cons affect your system more than you're willing to sacrifice. I assume that when you started out you had no idea you were following a microservices architecture, but since your application is now distributed over 40+ services, the following article might give you some insight into managing it properly.
https://derickbailey.com/2016/12/12/making-the-quantum-leap-from-node-js-to-microservices/
If they are all independent, then you do have to run individual servers. But if they talk to each other (it sounds like they talk to each other via socket.io), then you can use RPC calls instead. There are libraries like gRPC (Google) and TChannel (Uber). I hope they can solve your problem.
You can start several express and socket.io instances in one NodeJS process, if you wish, like this (based on Express Hello world example):
const express = require('express');

// First app
const app1 = express();
app1.get('/', function (req, res) {
    res.send('Hello World!');
});
app1.listen(3000, function () {
    console.log('Example app1 listening on port 3000!');
});

// Second app
const app2 = express();
app2.get('/', function (req, res) {
    res.send('Hello World 2!');
});
app2.listen(3001, function () {
    console.log('Example app2 listening on port 3001!');
});
But there are several aspects to consider:
Performance - if each app is handled by a separate NodeJS process, then each of them gets more CPU time, and high load on one of them will not affect the others.
Logging - if several apps are handled by one NodeJS process, you will need some way to distinguish which of them produced a given line of output; otherwise the output will be chaotic (see the sketch after this list).
Fault tolerance - if one NodeJS process handles several apps, then a critical failure in one app will crash all of them simultaneously.
Security - security issues in one app could potentially affect the other apps handled by the same NodeJS process.
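One small mitigation for the logging point: give each app a tagged logger so their lines can be told apart (a sketch; the names are illustrative):

// prefix every log line with the app that produced it
function makeLogger(name) {
    return function (message) {
        console.log('[' + name + '] ' + message);
    };
}

const log1 = makeLogger('app1');
const log2 = makeLogger('app2');
log1('listening on port 3000'); // -> [app1] listening on port 3000
log2('listening on port 3001'); // -> [app2] listening on port 3001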

Two NodeJS apps in a single gear on Openshift or a single Heroku dyno

Looking for a way to run two NodeJS apps on a single gear on Openshift Online (Premium), or the equivalent on Heroku with a single dyno. Each app will live in its own folder in the file system and have its own server.js file, listening on a different port.
Each app will have its own domain.
Each app can have its own git repo, or both apps can be in the same repo (different folder) if separate repos are not possible.
An alternate (and simple) solution would be to run each app on its own gear/dyno, but these apps are low traffic and do not justify the cost of running them separately.
Note: on Openshift, the apps run the following cartridges: nodejs-0.10 mongodb-2.4
I don't think that would be possible, because the PORT a node app runs on is fixed by these PaaS platforms, so you can't run two apps on the same port.
A workaround could be to use vhost and cluster (I haven't tried it myself):
Write a third repo with a server.js; this third repo would be bound as the main repo on Heroku/Openshift.
Write a shell script that runs on post-install/download of the third repo and downloads the other two repositories onto your remote machine.
Then, in the third repo, use the code below:
// server.js
var cluster = require('cluster');
var express = require('express');
var app = express();

if (cluster.isMaster) {
    cluster.fork(); // master just forks the worker that serves both apps
} else {
    app
        // express.vhost(hostname, app) routes requests by Host header
        // (bundled with Express 3; on Express 4 use the standalone 'vhost' package)
        .use(express.vhost('www.site1.com', require(PATH_TO_FIRST_REPO_SERVER.JS).app))
        .use(express.vhost('www.site2.com', require(PATH_TO_SECOND_REPO_SERVER.JS).app))
        .listen(8080);
}
