AWS Linux (Ubuntu) hosted application is not accessible from public ip - node.js

I hosted a nodejs (Express hello world) application on AWS Linux (Ubuntu 16.04) on the free tier. When I do wget http://localhost:8080 it runs successfully and saves the output to an index.html file.
But when I do the same thing with the public IP of my instance (wget http://35.154.40.189:8080), it says
Connecting to 35.154.40.189:8080... failed: No route to host.
I also used the steps given in http://www.lauradhamilton.com/how-to-set-up-a-nodejs-web-server-on-amazon-ec2 to forward all IPv4 traffic to my application, but it doesn't work.
I also enabled port 8080 from the AWS console.
netstat -atn says
netstat -ntlp says
I have tried everything I could find on the internet but have been unable to resolve the issue, and I'm quite frustrated at this point. Any help would be highly appreciated.

Make your instance in AWS first.
Enable the inbound rule as mentioned in the picture.
Enable the user group after making an SSH connection to the AWS Ubuntu instance.
Once the instance is running, install Node properly:
sudo apt-get update
sudo apt-get install libssl-dev g++ make
Download the Node source tarball from the web with wget:
wget https://nodejs.org/dist/v6.9.1/node-v6.9.1.tar.gz
tar -xvf node-v6.9.1.tar.gz
Now go into the extracted node directory and build:
./configure && make && sudo make install
Boom, your Node server is ready on the new AWS instance.
Or watch this: https://www.youtube.com/watch?v=WxhFq64FQzA&t=1693s
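Once your app is running, a quick reachability check may help (a sketch; port 8080 and the public IP are taken from the question above):
# on the instance: the app should answer locally and listen on all interfaces
curl http://localhost:8080
sudo netstat -ntlp | grep 8080    # 0.0.0.0:8080 means all interfaces; 127.0.0.1:8080 means loopback only
# from your own machine: this only works if the security group allows inbound TCP 8080
curl http://35.154.40.189:8080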

Related

Port problems to access a service inside a container

I'm posting for a friend. He asked for my help and we couldn't figure out what's going on.
My situation is: my application works perfectly on Ubuntu 18.04 when it's not inside a container, but the customer required the use of containers, so I created a Dockerfile so it could be started by a Docker container.
Here's the content of my Dockerfile:
FROM node:8.9.4
ENV HOME=/home/backend
RUN apt-get update
RUN apt-get install -y build-essential libssl-dev
RUN apt-get install -y npm
COPY . $HOME/
WORKDIR $HOME/
RUN npm rebuild node-sass
RUN npm install --global babel-cli
USER root
EXPOSE 6543
CMD ["babel-node", "index.js"]
After building the image, I execute the following Docker run command:
sudo docker run --name backend-api -p 6543:6543 -d backend/backendapi1.0
Taking a look at the log output, I can conclude that the application works properly:
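For reference, that log output can be followed with docker logs, using the container name from the run command above:
sudo docker logs -f backend-api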
I’ve created a rule in my nginx to redirect from port 90 to 6543 (before using containers it used to work)
server {
    listen 90;
    listen [::]:90;

    access_log /var/log/nginx/reverse-access.log;
    error_log /var/log/nginx/reverse-error.log;

    location / {
        proxy_pass http://localhost:6543;
    }
}
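After editing the nginx configuration, it is usually worth validating and reloading it, then checking the proxied port locally (standard nginx/curl commands; port 90 is the one from the config above):
sudo nginx -t && sudo systemctl reload nginx
curl -I http://localhost:90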
P.S.: I've tried changing localhost to the container's IP and it doesn't work either.
The fun fact is that when I try an internal telnet to port 6543, it accepts the connection and closes it immediately.
P.S.: all ports are open on the firewall.
The application works normally outside the container (using port 6543 and redirecting in nginx).
I'd appreciate it if someone could help us find out why this is happening. We don't have much experience creating containers.
Thanks a lot!
Edit: it's an AWS VM, but this is the return when we run the command curl:
We found the solution!!
It was an internal container routing problem...
The following Docker run command solved the problem:
sudo docker run --name my_container_name --network="host" -e MONGODB=my_container_ip -p 6543:6543 my_dockerhub_image_name
Thanks a lot!!
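Worth noting: with --network="host" the container shares the host's network stack, so the -p 6543:6543 mapping is effectively ignored; an equivalent run (a sketch keeping the placeholders from the command above) would be:
sudo docker run --name my_container_name --network="host" -e MONGODB=my_container_ip my_dockerhub_image_name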

Error connecting to docker container

I installed a service on a remote Linux computer using docker. I used the following commands
git clone https://github.com/OpenVidu/openvidu-tutorials.git
npm install -g http-server
http-server openvidu-tutorials/openvidu-getaroom/web
docker run -p 4443:4443 --rm -e openvidu.secret=MY_SECRET -e openvidu.publicurl=https://187.84.228.66:4443 openvidu/openvidu-server-kms
But when I try to connect the first time, I receive the following error message:
"ERR_EMPTY_RESPONSE", and sometimes "ERR_CONNECTION_CLOSED"
I ran the diagnostic command docker ps on the Linux computer and received the following response:
CONTAINER ID   IMAGE                                 COMMAND                CREATED      STATUS      PORTS                                                  NAMES
21b0620266cd   openvidu/openvidu-server-kms:latest   "/usr/bin/supervisor   5 days ago   Up 5 days   8443/tcp, 8888/tcp, 0.0.0.0:4443->4443/tcp, 9091/tcp   sick_ritchie
What is wrong? How can I solve this?
I'm not sure what causes this, but when I had the same problem (with a different HTTP server) I changed from listening on a particular IP address (which was 127.0.0.1 in my case) to listening on all interfaces - 0.0.0.0.
I mean the HTTP server configuration, not the Docker configuration.
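To see which address the server inside the container is actually bound to, one option is to inspect its listening sockets (a sketch; the container name comes from the docker ps output above, and it assumes netstat, or alternatively ss, is available in the image):
docker exec -it sick_ritchie netstat -ntlp
# 127.0.0.1:4443 would mean the service only accepts loopback connections inside the container;
# 0.0.0.0:4443 (or :::4443) means it also accepts connections forwarded from the published port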

Connecting app to AWS EC2 instance

I'm pretty new to DevOps and I'm trying to set up my Node.js app on an AWS server instance. Steps I've taken:
Set up Elastic IP
Launched EC2 instance with Ubuntu server
Connected IP to instance
Allowed incoming connections on port 3000
SSH'd into the server with a .pem file
Now I'm at the point where I need to get my files uploaded to the server. I've used FileZilla (and like it) in the past to upload files but the initial part was already set up. When I set up the site on FileZilla there is no /var/www folder on the remote site.
Don't know how to connect these dots.
Also not sure what I need to run once I successfully upload the files. I imagine npm install when I'm ssh'd into the server? Most of the tutorials out there only go through the basic instance setup.
Thanks!
You don't need to have /var/www. Also, it's better to use version control and a remote repository like GitHub, and then SSH into your EC2 instance and clone your repository there.
Then cd into your repo, run npm install, and start your app.
And check that it works.
Once you connect to the EC2 instance, clone your code there. It is not mandatory for it to be in /var/www/html, but it's best practice to keep it there. Once you have cloned it, run npm install in your project's home directory so all the required packages get installed. Then, to run your Node application in production, you have to run it under a service such as pm2, supervisor, forever, passenger, etc. You can use any of these services, configured appropriately, to run your application on the desired port. For pm2, you can follow this guide to install pm2. Then you can run it with the following command w.r.t. your environment; for example, I want to run my application on port 5555 for production:
$ PORT=5555 pm2 start app.js --name API --env production -f
Check the status using pm2 list. Now your application is running on http://server-ip:5555/. But you won't want to type the port number every time, so you need to configure a web server such as apache or nginx in front of your application, which will forward all requests to the port your application is running on. You can find the best guides on their home pages. Then your application will be available at http://server-ip/. You can follow this for a single configuration of multiple node apps.
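As an illustration of that nginx-in-front step, a minimal server block forwarding the default HTTP port to the app started above might look like this (a sketch; 5555 is the port from the pm2 example, and server_name and logging are omitted):
server {
    listen 80;
    location / {
        proxy_pass http://localhost:5555;
    }
}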
Hope this helps.

How to give a GUI to a remote server and access it from my local machine in Ubuntu 14.04 LTS

I have one Ubuntu server located in another city (remote). I want to give that server a dummy display/GUI and access it from my local Ubuntu machine. How can I do this? Please suggest whether there is a way to create a dummy display on that server from my local machine so I can access it like my own local machine.
Connect via SSH to your Linux server and do the following in the CLI:
sudo passwd root (set a password for the username you want to connect with)
sudo apt-get install ubuntu-desktop (to install your GUI)
sudo apt-get install xrdp (to install the middleware to connect through)
Open the port for remote connection on your cloud/host portal (port 3389).
Do your remote desktop to your VM DNS using that port and use your created username/password to go through.
You can use the ssh service. Open a terminal and write:
ssh root@ip_address_of_server
So if the IP address of the server is 192.168.1.22,
the command line will be:
ssh root@192.168.1.22
Do not forget to install the ssh service on the server.
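On Ubuntu, the SSH server package is openssh-server, so installing it looks like:
sudo apt-get install openssh-server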

hosting nodejs application in EC2

I'm interested in hosting nodejs applications in a cloud and I'm looking for free cloud hosting for this purpose. I've found that Amazon has one, but I have the following question: are there any tutorials on how I can set up and run a nodejs application on Amazon EC2?
EDIT: Can you suggest any good hosting options for nodejs (except Heroku)?
I've been using Node.js with Amazon EC2 for a while and was quite happy with both of them. For the moment AWS seems to be the cheapest and the most robust cloud provider, so picking Amazon wouldn't be a mistake. There's nothing special about running Node.js in the cloud - you work with it as if it were your own PC. Below are some general steps to follow for the simplest Node.js application running on an EC2 Ubuntu server:
Create Amazon EC2 account.
From AWS console start t1.micro instance with any Ubuntu AMI (example).
Login via SSH to your instance.
Install node.js: sudo apt-get install nodejs
Create new file test_server.js with the following content:
require("http").createServer(function(request, response) {
    response.writeHead(200, {"Content-Type": "text/plain"});
    response.write("Hello World!");
    response.end();
}).listen(8080);
Start the server: node test_server.js
Check it's working from another console: curl http://localhost:8080
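For the server to be reachable from outside the instance, the security group must also allow inbound TCP on port 8080; then, from your own machine (substituting your instance's public DNS name):
curl http://{your EC2 public DNS name}:8080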
Check out these tutorials (updated for 2021)
How to Deploy a Node.js Application On AWS EC2 Server
How to Deploy a Node.js application in AWS EC2
How To Deploy Your Node.js App On AWS With NGINX And SSL
Based on this tutorial, here's an updated step by step:
1) Make an account on Amazon Web Services.
2) Create an EC2 instance; I chose Ubuntu micro.
3) Configure Security Group (name it "Node") and add ports:
HTTP (80), HTTPS (443), and a custom TCP port for your Node app (e.g. 3000)
4) Launch the instance and save the pem file (private key), e.g. "node.pem".
5) On Windows - install Cygwin + the OpenSSH package. It is also recommended to install WinSCP to have "explorer like" access to the Linux instance.
6) Open Cygwin Terminal as Administrator, and set correct permissions to "node.pem" file:
chown :Users node.pem
chmod 400 node.pem
7) Find your EC2 instance public DNS name in the EC2 dashboard, and connect to it with SSH:
ssh -i node.pem ubuntu@{your EC2 public DNS name}
8) Update Ubuntu and install NodeJS:
sudo apt-get update
curl -sL https://deb.nodesource.com/setup_7.x | sudo -E bash -
sudo apt-get install -y nodejs
sudo apt-get install -y build-essential
9) Copy your NodeJS application into the EC2 instance (via Cygwin, or Winscp).
10) Install all of your Node app required modules:
cd /home/ubuntu/My_Node_App
npm install --save
11) Re-route ports with IPtables so that your app can be accessed on default http port 80:
sudo iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 3000
To view the iptables routing entries, run:
sudo iptables -t nat -L
If you need to remove routing entry (first line), run:
sudo iptables -t nat -D PREROUTING 1
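Note that iptables rules added this way do not survive a reboot by default; one common way to persist them on Ubuntu (assuming you don't mind installing an extra package) is:
sudo apt-get install -y iptables-persistent
sudo netfilter-persistent save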
12) Run your app as a background process:
sudo nohup node app.js &
To kill your app process:
ps -ef | grep app.js
sudo kill {process id number}
My blog post on how to deploy Node-based apps on EC2: http://devblog.daniel.gs/2014/01/deploying-node-apps-on-aws-ec2-with.html
Explaining:
Deploying Node apps from your github repo (private+public)
Automating the deployment process using scripts
Reverse proxy using Nginx
and using Forever utility.
Hope this helps.
There are quite some hosting solutions for Node.js available, here are a couple of these:
Joyent
Joyent is the corporate sponsor and trademark owner of Node.js and provides an appealing alternative to Amazon EC2 for many things, not the least Node.js hosting of course, see the Joyent's Node.js Development Environment (please check the Node.js™ Development SmartMachine Terms of Service though).
Apparently they are just restructuring this development offering though:
For the past year, Joyent Cloud has provided a free development
sandbox for users of Node.js. Over time, the community has made it
clear that they want more tools and more capacity. To this end, we are
excited to announce a partnership with Nodejitsu to provide both of
these in a world-class Node.js development environment with
Nodejitsu's development and management tools running on Joyent Cloud's
Infrastructure-as-a-Service platform. The new service will launch very
shortly.
Accordingly, it is not entirely clear yet how the pricing options for a production hosting of a Node.js solution will end up, but given Joyent's competitive pricing, I'd expect a similar option at least.
Cloud Foundry
The Cloud Foundry Open Platform as a Service Project supports Node.js as well, amongst many other frameworks (which makes the platform so exciting). The platform has been getting quite some traction recently and is meanwhile used by several Platform as a Service (PaaS) providers as their backend - amongst these are (in no particular order and not necessarily complete):
AppFog - Simple PaaS for Java, Node, .Net, Ruby, PHP, MySQL, Mongo, PostgreSQL, and more...
Freedom to move between IaaS at will with the easiest pricing in the cloud.
Cloud Foundry (VMware) (corporate sponsor of Cloud Foundry) - Deploy and scale applications in seconds, without locking yourself into a single cloud.
Iron Foundry - Iron Foundry is an open source project that extends Cloud Foundry™ to the .NET ecosystem by providing services, installers, and developer tools.
Most of these are in beta still and the pricing isn't settled yet, but given the competition I'd expect quite some interesting options here over time.
The easiest way to run node.js for free on EC2 is IMHO on Heroku.
Check out this complete tutorial here.
This tutorial shows how to install Node.js on EC2 and configure HTTP ports and nginx for port forwarding, as well as how to use supervisor to keep Node.js running forever, since it normally stops when you close your SSH console session.
I just went through the Heroku sign-up and application tutorial. Could not have been easier. What a delightful experience...
...right up to the point where you can't have a MongoDB instance as a free gear. The minimum cost (other than a free trial month) is $18/month per GB of storage.
Honestly, the better choice then is OpenShift. It's got three free gears, which is enough for a lot of beginner stuff like what I'm doing. Both Heroku and OpenShift are within Amazon's space but their customer interfaces are different. I thought Heroku's was easier for beginners to get started with but, as I mentioned, there's no free lunch on the database side of things.
