Yeoman generator-angular-fullstack enabling ssl - node.js

I'm trying to run an SSL server using generator-angular-fullstack (https://github.com/DaftMonk/generator-angular-fullstack).
However, when I look at examples for enabling SSL and comb through the generated code, it doesn't initialize the server the way the Node.js documentation describes:
var fs = require('fs'),
    http = require('http'),
    https = require('https'),
    express = require('express');

var options = {
  key: fs.readFileSync('test/fixtures/keys/agent2-key.pem'),
  cert: fs.readFileSync('test/fixtures/keys/agent2-cert.pem')
};

// Create a service (the app object is just a callback).
var app = express();

// Create an HTTP service.
http.createServer(app).listen(80);

// Create an HTTPS service identical to the HTTP service.
https.createServer(options, app).listen(443);
Has anyone had any success doing this? Apart from that, this generator seems incredible and easy to use.

Yes, the code above is how you run your app on 443, using the key and cert you have specified. This should allow you to communicate with your app over HTTPS, assuming you have those keys (and of course you'll get browser warnings if they're self-signed).
But yes, that works, and that's how it's done. I've found that most people prefer to keep the Node app running on plain HTTP and put a web server (such as nginx) in front of it to handle SSL; the communication from the web server to the Node app is then over HTTP. This keeps the Node app easy to run in a development/test environment, while in production you still have the security of SSL.

Related

Added SSL to client side server but still not performing handshake with backend server nodejs

I am trying to implement SSL on my Node.js project. Currently, my servers are split between a client-side server running on localhost port 443 and a backend server running on localhost port 5000. I have already added a self-signed SSL certificate generated with OpenSSL to my client-side server, as shown below.
Now here's my issue. When I send a POST request to log in, from what I understand, a handshake is supposed to happen between the server and the client to establish a secure connection. However, that's not the case. When I used Wireshark to intercept the packets, there was no handshake happening in the process.
I am currently not sure how to proceed because I have limited knowledge of this kind of security topic. Do I need to sign a new key and cert and add them to my backend server? Or am I doing everything wrong? If so, can I get a source or guide on how to properly create one for a Node.js server?
You have many options here for securing your backend server:
First, you can use an Nginx reverse proxy and add the SSL/TLS configuration to it; Nginx will handle this for you.
Second, you can use the https package (https://nodejs.org/api/https.html) directly and pass your SSL certificate and key to it:
const https = require('https');
const fs = require('fs');

const options = {
  key: fs.readFileSync('test/fixtures/keys/agent2-key.pem'),
  cert: fs.readFileSync('test/fixtures/keys/agent2-cert.pem')
};

https.createServer(options, (req, res) => {
  res.writeHead(200);
  res.end('hello world\n');
}).listen(8000);
Remember that the domain name you are trying to access must resolve to your host's IP.
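To illustrate the first (Nginx) option: a minimal TLS-terminating server block might look like the following. The certificate paths and server name are assumptions, and the backend port 5000 is taken from the question; adjust all of them for your setup.

```nginx
server {
    listen 443 ssl;
    server_name example.local;                         # assumption: your domain

    ssl_certificate     /etc/nginx/certs/server.crt;   # assumption: cert path
    ssl_certificate_key /etc/nginx/certs/server.key;   # assumption: key path

    location / {
        # TLS terminates here; traffic to the Node backend is plain HTTP
        proxy_pass http://127.0.0.1:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

With a setup like this, the handshake you are looking for in Wireshark happens between the browser and Nginx, not with the Node process itself.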

socket.io + azure app services 503 error even though websockets is ENABLED

I am trying to set up socket.io with my Node.js server on Azure app services. It works perfectly fine in my local server. However, I can't seem to get it to work on Azure.
I have enabled web sockets in my Azure App Services -> configuration -> general settings. However, this does not work.
I have followed instructions in this stackoverflow post: Socket IO net::ERR_CONNECTION_REFUSED but it hasn't worked for me.
My server side code:
...
const port = process.env.PORT || 5000;
const app: Application = express();
const server = require("http").Server(app);
const io = require("socket.io")(server);
...
io.of("/chat").on("connection", async function (socket: any) {
  console.log("hi");
});
...
server.listen(port, () => console.log(`listening on port ${port}`));
My CORS settings have also been set up appropriately; HTTP requests to my server work fine, as do requests that require credentials.
Any help would be appreciated. Thanks
EDIT (MORE INFO):
I am using in-house authentication. Because of this, I need to set the 'Access-Control-Allow-Credentials' header to true. My Express app was setting this via the cors npm module; however, no matter what I did, I could not get HTTP requests to work when doing authentication requests.
I was able to solve this by doing:
az resource update --name web --resource-group <myResourceGroupName> --namespace Microsoft.Web --resource-type config --parent sites/<site-name> --set properties.cors.supportCredentials="true" --api-version 2015-06-01
which I found here: https://learn.microsoft.com/bs-latn-ba/azure/app-service/app-service-web-tutorial-rest-api
This allowed HTTP requests with credentials to work, as well as any other HTTP requests. It is important to note that in my Azure settings, all my CORS settings are still blank in order to allow my express app to handle CORS.
Not sure if this is related...
EDIT2:
when I try to connect from my client with
const socketIo = io(socketUrl + "/chat", {
  // transports: ["websocket"]
  upgrade: false,
});
everything works fine. But if I uncomment transports: ["websocket"], the connection fails. It's got to have something to do with the WebSocket settings in Azure.
I would suggest providing a minimal working example (server and a client page); it would simplify things.
Here are a couple things to check:
Disable perMessageDeflate in your server-side Node.js code, for example like this:
const io = require("socket.io")(server, {
  perMessageDeflate: false
});
Azure Web Apps only listens on ports 80 and 443, which is a common issue
It doesn't look like the case here, but it happens so often that I'll leave it here to help others. According to the docs:
Q: Can I expose more than one port on my custom container image?
A: We don't support exposing more than one port.
So, if this is the case, change the port to one of those and your app will work fine.
I hope it helps! 🙂
So I made a barebones example to test the socket connection... and it worked.
Then, I tried my app again.. and it worked. Dunno what happened but everything is working now...

Is it necessary to use HTTPS with express while HTTPS is already configured for my Elastic Beanstalk environment?

I'd like to make sure any communications with my web app are secured.
It's the first web app I develop and I'm really new to this backend/infrastructure world so my question might sound a bit silly.
My app is written in Node.js and I use Express:
var express = require('express'),
    bodyParser = require('body-parser'),
    methodOverride = require('method-override'),
    errorHandler = require('errorhandler'),
    jsdom = require('jsdom'),
    http = require('http'),
    ...

var server = http.createServer(app).listen(app.get('port'), function () {
  // Allow prompt in Node after launching
  repl = require("repl");
  repl.start("> ");
});
This app runs on an AWS EC2 instance (I don't know if the wording is correct), and any communication with the app is secured with HTTPS (I can make API calls to https://my.api.com/get/results, for instance).
Everything works really fine so far but I'm wondering if all of this is safe.
As you may have noticed, I am not using HTTPS with the express server:
http = require('http')
The thing is, as far as I understand, the Express server is still "behind" an HTTPS-secured portal in my case.
Better to ask before releasing something unsecured..
If you have a reverse proxy that handles HTTPS for you then there is usually no need to use HTTPS in your Node application, especially if the network that those two communicate over is secure - like a loopback interface or an internal network in your data center. If the reverse proxy and your Node app communicate over the public Internet then you need to use HTTPS for that traffic as well.
See those answers for more info:
reverse proxy using ngix and ssl implementation on express failed
SSL With node / IIS
How to run nodejs server over 443 ensuring nginx doesnt stop working
Configuring HTTPS for Express and Nginx

OpenShift Invalid certificate | SSL on NodeJS app

Here's what I'm working with:
NodeJS/Express app
OpenShift environment (Enterprise account)
Works over HTTP
Certificate trust error over HTTPS
Using default wildcard certificate provided by OpenShift
Begins working if I manually accept the exception the browsers raise
Latest Express
Server.js looks something like:
var express = require("express"),
    app = express(),
    IP = process.env.OPENSHIFT_NODEJS_IP || "127.0.0.1",
    PORT = process.env.OPENSHIFT_NODEJS_PORT || 8888; // it's 8080 on OpenShift; I use 8888 locally for irrelevant reasons

// we sometimes need special endpoints that aren't files
app.get("/something-special", function (req, res) {
  res.send("something special");
});

// but typically it's static files
app.use(express.static(__dirname + "/public"));

// go!
app.listen(PORT, IP);
When I go to https://myserver/file.js (which lives in /public/file.js), I get an error saying the certificate is not trusted.
I don't understand certificates much, and I barely know Node. This is a learning project, so I'm trying to work through all of the issues I come across without changing course.
I've tried everything I can think of, including:
app.enable('trust store') recommended on a different SO
simplifying my Node app and using req.secure to force HTTPS
You might try visiting your app using the https://appname-yourdomainname.rhcloud.com/ version of the URL. The underlying digital certificate is *.rhcloud.com and was issued by "Geotrust SSL CA" for what it's worth. If you do it this way you don't get certificate-related errors because they applied a wildcard-based cert to the servers.
I'm not sure that the free tier of the hosting allows private SSL certificates to be provided/bound... Yeah, you need Bronze or better to add a private SSL certificate for your application. Bummer.
More than likely what is happening is that you are trying to use the *.rhcloud.com wildcard SSL certificate with your custom domain, and that won't work. OpenShift supplies you with an SSL certificate that matches your app-domain.rhcloud.com address. If you want to use SSL correctly with your custom domain, you need to acquire (or purchase) a custom SSL certificate for your domain name. You can get one from lots of companies online, or you can get a free one here: https://www.startssl.com
Also, the SSL is terminated on the proxy, before it gets to your gear. Check out this developer center article for more information about how it all works: https://developers.openshift.com/en/managing-port-binding-routing.html

Running multiple sites on node.js

I'm planning to build three sites using Node.js, and some templates are shared among the sites. Should I run all three sites on a single Node.js instance?
I'm aware of the 'vhost' middleware that allows you to run multiple domains on a single HTTP server. Is there a better option?
I've also got some static HTML templates and I'm not sure how to deal with these in Node.js.
Finally, I would like to know the hosting options for this kind of setup.
I just had to do this exact same thing myself. What you want to do is use some sort of reverse proxy.
The one I use is here: https://github.com/nodejitsu/node-http-proxy
Simply install the proxy package: npm install http-proxy
What I do is run the proxy on the server on port 80 and set up the DNS for each domain to point to this server.
Each application runs on the same server (I'm using screen sessions).
For example:
MySiteApplication1 - 3001
MySiteApplication2 - 3002
MySiteApplication3 - 3003
Then your proxy server file would look like this:
var httpProxy = require('http-proxy');

var server = httpProxy.createServer({
  router: {
    'mysite1.com': 'localhost:3001',
    'mysite2.com': 'localhost:3002',
    'mysite3.com': 'localhost:3003'
  }
});

server.listen(80);
