Node.js Multiple SSL Certificates

Is it possible to have Node.js use multiple SSL certificates? I am currently using one certificate but had a new certificate issued that matches other domains.
Since my server is behind a load balancer, there are two ways to reach it, and I'd like to match both. Is there a way to use two certificates, instead of creating a single certificate that matches both?

See this answer for hosting multiple domains on a single https server
In short, you can use the SNI callback from the https server.
SNI stands for Server Name Indication and it is implemented by all modern browsers.
How it works:
The browser sends the requested hostname unencrypted as part of the TLS handshake (if it supports SNI). The rest of the request is encrypted as part of the TLS session. The https module then lets you decide which SSL certificate will be used for each connection.
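A minimal sketch of such an SNI callback (the hostnames and certificate paths below are placeholders, not anything from the question):

const fs = require('fs');
const tls = require('tls');
const https = require('https');

// One SecureContext per hostname we hold a certificate for.
// Hostnames and file paths are illustrative placeholders.
const contexts = {
  'www.example.com': tls.createSecureContext({
    key: fs.readFileSync('certs/example.com.key'),
    cert: fs.readFileSync('certs/example.com.crt'),
  }),
  'www.example.net': tls.createSecureContext({
    key: fs.readFileSync('certs/example.net.key'),
    cert: fs.readFileSync('certs/example.net.crt'),
  }),
};

const server = https.createServer({
  // Default certificate for clients that don't send SNI.
  key: fs.readFileSync('certs/example.com.key'),
  cert: fs.readFileSync('certs/example.com.crt'),
  // Called during the handshake with the hostname the client asked for.
  SNICallback: (servername, cb) => {
    cb(null, contexts[servername] || contexts['www.example.com']);
  },
}, (req, res) => {
  res.end('Hello from ' + req.headers.host + '\n');
});

server.listen(443);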
SNI Notes:
SNI is used by AWS CloudFront and other services when a secure connection is required.
SNI requires a modern browser to work. However, the fact that AWS relies on it gives me confidence in using it too.
Depending on how you implement it, it may slow down the request.
It may be better to put a nginx proxy in front of this.
Your connection then travels like this: Client -> (HTTPS) -> NGINX -> (HTTP) -> Node
Nginx could also serve static files, which may optimise your site.
I hope this helps you. I am posting this for documentation purposes.

Related

Is HTTPS behind reverse proxy needed?

I have an API server running behind an nginx reverse proxy. It is important to have all requests to my API server be secured via TLS since it handles sensitive data.
I've set up nginx to work with TLS (LetsEncrypt), so that seems to be okay. However, requests from nginx to my API server are still insecure http requests (this is all happening across docker containers, by the way).
Is it a best practice to also setup https between the reverse proxy and the API server? If so, how would I go about doing that without over-engineering it?
It all comes down to how secure or paranoid you'd like your implementation to be. It may also depend on the type of data you're playing with. For instance: I'd definitely do this for credit card numbers or other sensitive information.
As the comments have already stated, you would typically terminate SSL connections at the front facing webserver, assuming the API backend is also inside your LAN, which you trust and control. If you want to go that extra mile, you could also set up SSL on the API backend. Details of how to do that depend on the software you're using on your backend.
If you do decide to implement SSL on the API backend, the setup would be similar to what you did to setup Nginx with SSL on the frontend, with the main difference being you don't need to use a public certificate on the backend. It can be self-signed, since no one else besides your web server will be talking to it. Then it's just a matter of fixing all the URIs in your code to use HTTPS.
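For illustration, a minimal sketch of what the Node API backend could look like once it speaks TLS with a self-signed certificate (the file paths and port here are placeholders; the key/cert pair would be generated with openssl or a similar tool):

const fs = require('fs');
const https = require('https');

// Self-signed key/cert pair; only nginx talks to this server,
// so the certificate does not need to come from a public CA.
const options = {
  key: fs.readFileSync('/etc/ssl/private/api-selfsigned.key'),
  cert: fs.readFileSync('/etc/ssl/certs/api-selfsigned.crt'),
};

https.createServer(options, (req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ ok: true }));
}).listen(8443);

nginx's proxy_pass would then point at https://<backend-host>:8443 instead of the plain http address.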

require('https') vs require('tls')

I'm trying to create a very secure connection between client and server using Node.js, Express.js and TLS (1.2).
I think my problem is in understanding what TLS actually is - meaning what is being exchanged, when and how by who.
Anyhow, I'm searching the internet like a nutter to try and figure out the following:
what does var tls = require('tls'); invoke?
what does var https = require('https'); invoke?
I can get tls working when using another Node process as the client, but in this case the client will be a user in a browser. Can I use both for a browser, or only https?
Thanks
Let's indeed start with what TLS is.
TLS is a way to provide secure connections between a client and a server. It does this by giving clients and servers a safe way to exchange keys (using public-key cryptography), after which they encrypt their traffic with symmetric ciphers. The exact mechanism is found here, but it's really not important for this answer.
Now, what is https? Well first, let's talk about HTTP. HTTP is a protocol that defines how web servers and clients talk and exchange web pages or data. Basically, the client sends a request and the server responds with a numeric status code, headers, and (optionally) a body. If you're familiar with writing web pages, this will all be familiar.
So now, finally, what is HTTPS? HTTPS is a version of HTTP using TLS to secure data. This means that clients and servers can use the same protocol they're used to, wrapped in encryption.
Now, let's talk about these in node.js.
When you use require('tls'), you're only using the encryption layer, without defining the protocol. This will work fine for anything that doesn't expect an exact protocol, such as your other node.js client.
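As an illustration, a bare tls echo server (the certificate paths are placeholders):

const fs = require('fs');
const tls = require('tls');

const server = tls.createServer({
  key: fs.readFileSync('server.key'),   // placeholder paths
  cert: fs.readFileSync('server.crt'),
}, (socket) => {
  // A raw encrypted stream: no HTTP semantics, just bytes.
  socket.write('hello over TLS\n');
  socket.pipe(socket); // echo back whatever the client sends
});

server.listen(8000);

A Node client could connect to this with tls.connect(), but a browser pointed at the same port would have no idea what to do with it, because no application protocol (such as HTTP) is spoken on top of the encryption.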
When you use require('https'), you're specifically using HTTP over TLS. In fact, https.Server is a subclass of tls.Server, so whenever you're using the https module, you're also using the tls one.
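The https equivalent, using the same placeholder certificate files, is what a browser can actually talk to:

const fs = require('fs');
const https = require('https');

https.createServer({
  key: fs.readFileSync('server.key'),   // placeholder paths
  cert: fs.readFileSync('server.crt'),
}, (req, res) => {
  // Full HTTP semantics running on top of the TLS layer.
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('hello over HTTPS\n');
}).listen(443);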
Now, the final question: What does the browser want? If you've been following everything I've said, you can see that the browser wants https. In fact, it's likely that most of the webpages you've visited today have been served over https.

Create a Reverse Proxy in NodeJS that can handle multiple secure domains

I'm trying to create a reverse proxy in NodeJS, but I keep running into the issue that I can only serve one cert/key pair on the same port (443), even though I want to serve multiple domains. I have done the research and keep running into the same road block:
A node script that can serve multiple secure domains from a non-secure local source (accessed locally over http, served publicly over https)
Let me dynamically serve SSL certificates via the domain header
Example:
https://www.someplace.com:443 will pull from http://thisipaddress:8000 and use the cert and key files for www.someplace.com
https://www.anotherplace.com:443 will pull from http://thisipaddress:8080 and use the cert and key files for www.anotherplace.com
etc.
I have looked at using NodeJS's https.createServer(options, [requestListener])
But this method supports just one cert/key pair per port
I can't find a way to dynamically switch certs based on domain header
I can't ask my people to use custom https ports
And I'll run into browser SSL certificate errors if I serve the same SSL certificate for multiple domain names, even though the connection is secure
I looked at node-http-proxy but as far as I can see it has the same limitations
I looked into Apache mod-proxy and nginx but I would rather have something I have more direct control of
If anyone can show me an example of serving multiple secure domains each with their own certificate from the same port number (443) using NodeJS and either https.createServer or node-http-proxy I would be indebted to you.
Let me dynamically serve SSL certificates via the domain header
There is no domain header so I guess you mean the Host header in the HTTP request.
But, this will not work because
HTTPS is HTTP encapsulated inside SSL
therefore you first have to do your SSL layer (e.g. SSL handshake, which requires the certificates), then comes the HTTP layer
but the Host header is inside the HTTP layer :(
In former times you would need a separate IP address for each SSL certificate. Current browsers do support SNI (Server Name Indication), which sends the expected target host already inside the SSL handshake. It looks like node.js does support this; look for SNICallback.
But beware that there are still plenty of libraries out there which either don't support SNI on the client side at all, or where one needs to enable it explicitly. As long as you only want to support browsers, this should be ok.
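A rough sketch of the proxy the question describes, combining SNICallback with node-http-proxy so each hostname gets its own certificate and its own local backend (the hostnames, ports and file names are just the placeholders from the question):

const fs = require('fs');
const tls = require('tls');
const https = require('https');
const httpProxy = require('http-proxy'); // npm install http-proxy

// Per-domain configuration: certificate files plus the local backend to proxy to.
const sites = {
  'www.someplace.com':    { key: 'someplace.key',    cert: 'someplace.crt',    target: 'http://127.0.0.1:8000' },
  'www.anotherplace.com': { key: 'anotherplace.key', cert: 'anotherplace.crt', target: 'http://127.0.0.1:8080' },
};

const contexts = {};
for (const [host, site] of Object.entries(sites)) {
  contexts[host] = tls.createSecureContext({
    key: fs.readFileSync(site.key),
    cert: fs.readFileSync(site.cert),
  });
}

const proxy = httpProxy.createProxyServer({});

https.createServer({
  // Fallback certificate for clients that don't send SNI.
  key: fs.readFileSync(sites['www.someplace.com'].key),
  cert: fs.readFileSync(sites['www.someplace.com'].cert),
  SNICallback: (servername, cb) => cb(null, contexts[servername]),
}, (req, res) => {
  const site = sites[req.headers.host];
  if (!site) {
    res.writeHead(404);
    return res.end();
  }
  proxy.web(req, res, { target: site.target });
}).listen(443);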
Redbird actually does this very gracefully and is not too hard to configure either.
https://github.com/OptimalBits/redbird
Here is the solution you might be looking for; I found it very useful for my implementation, though you will need to do significant customization to handle multiple domains.
node-http-proxy:
https://github.com/nodejitsu/node-http-proxy
Bouncy is a good library for this and has an example of what you need.
As Steffen Ullrich says, it will depend on browser support for SNI.
How about creating the SSL servers on different ports and using node-http-proxy as a server on 443 to relay the request based on the domain?
You stated you don't want to use nginx for that, and I don't understand why. You can just set up multiple server blocks in your nginx config, have each of them listen for a different hostname, and have them all listen on port 443.
Give all of them a proxy_pass to your Node.js server. To my understanding, that serves all of your requirements and is the standard approach.

in node.js, using ssl with more than one domain name with one ip address

I'm used to Apache on Linux, where each domain name using SSL requires its own IP address.
Is this still true if I'm using node.js and not using Apache at all?
The same limitations apply in node.js as in Apache -- they're nothing to do with the particular server software you're using, they're inherent in the http and TLS/SSL protocols.
Having said that, there are two ways to run SSL for multiple domains from a single IP address. I don't know the status of node.js support for either of these, but it shouldn't matter for the first alternative.
First, you can get a single SSL certificate that covers all of the domain names you want to use -- either a wildcard if they're all subdomains of the same domain or one that uses Subject Alternative Names (SAN) if they're not. Note that SAN is not supported by some older web browsers, especially on some smartphones.
Second, you can use Server Name Indication (SNI) to configure multiple SSL certificates, as it extends the SSL protocol to make the hostname available to the server before it's done the key exchange. Browser support for SNI is not as good as for SAN, and in particular it doesn't work with any Internet Explorer version on Windows XP.
This link shows how to do it with nginx using the SNI method.
https://www.digitalocean.com/community/articles/how-to-set-up-multiple-ssl-certificates-on-one-ip-with-nginx-on-ubuntu-12-04
You might as well let nginx do the https and file serving and have it reverse proxy into node.js for the api work, as shown here:
https://stackoverflow.com/a/15008873/151312

Can a proxy server cache SSL GETs? If not, would response body encryption suffice?

Can a (or any) proxy server cache content that is requested by a client over https? As the proxy server can't see the query string or the http headers, I reckon it can't.
I'm considering a desktop application, run by a number of people behind their company's proxy. This application may access services across the internet and I'd like to take advantage of the built-in internet caching infrastructure for 'reads'. If caching proxy servers can't cache SSL-delivered content, would simply encrypting the content of a response be a viable option?
I am considering having all GET requests that we wish to be cacheable performed over http, with the body encrypted using asymmetric encryption, where each client has the decryption key. Any time we wish to perform a GET that is not cacheable, or a POST operation, it will be performed over SSL.
The comment by Rory that the proxy would have to use a self-signed cert is not strictly true.
The proxy could be implemented to generate a new cert for each new SSL host it is asked to deal with and sign it with a common root cert. In the OP's scenario of a corporate environment, the common signing cert can rather easily be installed as a trusted CA on the client machines, and they will gladly accept these "faked" SSL certs for the traffic being proxied, as there will be no hostname mismatch.
In fact this is exactly how software such as the Charles Web Debugging Proxy allow for inspection of SSL traffic without causing security errors in the browser, etc.
No, it's not possible to cache https directly. The whole communication between the client and the server is encrypted. A proxy sits between the server and the client; in order to cache the content, it needs to be able to read it, i.e. decrypt the encryption.
You can do something to cache it, though. You basically terminate the SSL on your proxy, intercepting the SSL meant for the client. The data is encrypted between the client and your proxy; it's decrypted, read and cached, then re-encrypted and sent on to the server. The reply from the server is likewise decrypted, read and re-encrypted. I'm not sure how you do this in major proxy software (like squid), but it is possible.
The only problem with this approach is that the proxy will have to use a self-signed cert for the encryption to the client. The client will be able to tell that a proxy in the middle has read the data, since the certificate will not be from the original site.
I think you should just use SSL and rely on an HTTP client library that does caching (e.g. WinInet on Windows). It's hard to imagine that the benefits of enterprise-wide caching are worth the pain of writing a custom security encryption scheme or certificate fun on the proxy. Worse, with the encryption scheme you mention, doing asymmetric ciphers on the entity body sounds like a huge perf hit on the server side of your application; there is a reason that SSL uses symmetric ciphers for the actual payload of the connection.
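To make that concrete: raw asymmetric encryption can only handle inputs smaller than the key size, so the scheme from the question would in practice have to become a hybrid one (encrypt the body with a symmetric key, then encrypt just that key asymmetrically), which is essentially re-implementing what TLS already does. A hypothetical sketch with Node's crypto module, assuming a pre-distributed RSA key pair whose private half every client holds:

const crypto = require('crypto');

// Hypothetical pre-distributed RSA key pair (clients hold the private key).
const { publicKey, privateKey } = crypto.generateKeyPairSync('rsa', { modulusLength: 2048 });

function encryptBody(body) {
  // A fresh symmetric key for the payload -- RSA alone cannot encrypt large bodies.
  const aesKey = crypto.randomBytes(32);
  const iv = crypto.randomBytes(12);
  const cipher = crypto.createCipheriv('aes-256-gcm', aesKey, iv);
  const ciphertext = Buffer.concat([cipher.update(body, 'utf8'), cipher.final()]);
  return {
    key: crypto.publicEncrypt(publicKey, aesKey), // only the small AES key is RSA-encrypted
    iv,
    tag: cipher.getAuthTag(),
    ciphertext,
  };
}

function decryptBody({ key, iv, tag, ciphertext }) {
  const aesKey = crypto.privateDecrypt(privateKey, key);
  const decipher = crypto.createDecipheriv('aes-256-gcm', aesKey, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}

console.log(decryptBody(encryptBody('a cacheable response body')));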
How about setting up a server cache on the application server behind the component that encrypts https responses? This can be useful if you have a reverse-proxy setup.
I am thinking of something like this:
application server <---> Squid or Varnish (cache) <---> Apache (performs SSL encryption)
