Proxy subdomain of one domain to the subdomain of another domain - node.js

I want to set up a basic proxy like the one browser-sync has. For example, I want to go to web1.domain1.xyz and it should proxy to web1.domain.xyz (so anything.domain1.xyz maps to anything.domain.xyz).
I got this part working already:
// HTTP Proxy
var simpleHttp = require("http");
var simpleHttpProxy = require("http-proxy");

var simpleProxy = simpleHttpProxy.createServer();

simpleHttp.createServer(function (req, res) {
  // Map *.domain1.xyz / *.domain2.xyz / *.domain3.xyz hosts to *.domain.xyz
  var target = 'http://' + req.headers.host.replace(/domain1\.xyz|domain2\.xyz|domain3\.xyz/gi, 'domain.xyz');
  simpleProxy.web(req, res, {
    rewriteRules: true,
    xfwd: true,
    toProxy: true,
    changeOrigin: true,
    hostRewrite: true,
    autoRewrite: true,
    protocolRewrite: true,
    target: target
  });
}).listen(4000);
I would also like to be able to point multiple domains at it, for example domain1.com, domain2.com, and domain3.com.
However, it doesn't rewrite the links in the proxied pages. I don't know how browser-sync does this; I've been studying https://github.com/BrowserSync/browser-sync/blob/master/lib/server/proxy-server.js and https://github.com/BrowserSync/browser-sync/blob/master/lib/server/proxy-utils.js
I see that they use a custom function to replace the links, but I didn't succeed in implementing their logic.

I'm not sure how to do what you're asking for with http-proxy, but I'm almost certain it can be done with the vhost module. There's even a suitable example in its README. I've always found vhost to be very light and functional.
var connect = require('connect')
var serveStatic = require('serve-static')
var vhost = require('vhost')
var mailapp = connect()
// add middlewares to mailapp for mail.example.com
// create app to serve static files on subdomain
var staticapp = connect()
staticapp.use(serveStatic('public'))
// create main app
var app = connect()
// add vhost routing to main app for mail
app.use(vhost('mail.example.com', mailapp))
// route static assets for "assets-*" subdomain to get
// around max host connections limit on browsers
app.use(vhost('assets-*.example.com', staticapp))
// add middlewares and main usage to app
app.listen(3000)
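For the proxying part specifically, a hedged sketch (my own assumption, not from the README) of combining a wildcard vhost with http-proxy might look like this:
var connect = require('connect');
var vhost = require('vhost');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer();
var app = connect();

// Any subdomain of domain1.xyz is proxied to the same subdomain of domain.xyz
app.use(vhost('*.domain1.xyz', function (req, res) {
  // req.vhost[0] holds the wildcard match, e.g. 'web1'
  proxy.web(req, res, {
    target: 'http://' + req.vhost[0] + '.domain.xyz',
    changeOrigin: true
  });
}));

app.listen(4000);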

Edit: I thought you had a working example and didn't understand the question correctly :-/ Read further down.
You can chain the replace calls:
// HTTP Proxy
var simpleHttp = require("http");
var simpleHttpProxy = require("http-proxy");

var simpleProxy = simpleHttpProxy.createServer();
var domains = ['domain1.xyz', 'domain2.xyz', 'domain3.xyz'];

simpleHttp.createServer(function (req, res) {
  // Call replace for every entry in domains
  var domain = req.headers.host;
  for (var i = 0; i < domains.length; i++) {
    // Build a real RegExp: a string like '/'+domains[i]+'/gi' would be
    // matched literally and never replace anything
    domain = domain.replace(new RegExp(domains[i].replace(/\./g, '\\.'), 'gi'), 'domain.xyz');
  }
  var target = 'http://' + domain;
  simpleProxy.web(req, res, {
    rewriteRules: true,
    xfwd: true,
    toProxy: true,
    changeOrigin: true,
    hostRewrite: true,
    autoRewrite: true,
    protocolRewrite: true,
    target: target
  });
}).listen(4000);
If you want to proxy to different servers, you can make a dictionary and replace each key with its mapped value:
// Proxy domain1 => domain2, and domain3 => domain4
var domains = { 'domain1.xyz': 'domain2.xyz', 'domain3.xyz': 'domain4.xyz' };
[...]
for (var i in domains) {
  domain = domain.replace(new RegExp(i.replace(/\./g, '\\.'), 'gi'), domains[i]);
}
Edit:
To rewrite the links you really do have to replace them in the response body. Browser-sync uses a regex to parse the links out of the content and replaces each link with a simple string.replace. The functions to look at are here: https://github.com/BrowserSync/browser-sync/search?utf8=%E2%9C%93&q=rewriteLinks
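If you want to stay with http-proxy, here is a minimal sketch of that idea (assuming http-proxy's selfHandleResponse option and proxyRes event, and that the upstream response is not gzip-compressed; the domain names are the ones from the question): buffer the response and run a string replace over HTML bodies, similar in spirit to what browser-sync does:
var http = require("http");
var httpProxy = require("http-proxy");

// selfHandleResponse tells http-proxy not to pipe the upstream response
// itself, so we can rewrite the body before sending it to the client
var proxy = httpProxy.createProxyServer({ selfHandleResponse: true });

proxy.on("proxyRes", function (proxyRes, req, res) {
  var chunks = [];
  proxyRes.on("data", function (chunk) { chunks.push(chunk); });
  proxyRes.on("end", function () {
    var body = Buffer.concat(chunks);
    var contentType = proxyRes.headers["content-type"] || "";
    if (contentType.indexOf("text/html") !== -1) {
      // Rewrite links pointing at the target back to the proxied host
      body = Buffer.from(body.toString().replace(/domain\.xyz/gi, "domain1.xyz"));
    }
    // The length may have changed; drop the stale header so Node falls
    // back to chunked transfer encoding
    delete proxyRes.headers["content-length"];
    res.writeHead(proxyRes.statusCode, proxyRes.headers);
    res.end(body);
  });
});

http.createServer(function (req, res) {
  proxy.web(req, res, { target: "http://domain.xyz", changeOrigin: true });
}).listen(4000);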

Related

Wildcard subdomain info sharing between node server and Nuxt/Vue client

We are building a multi-tenant solution with NodeJS/Express for the back end and VueJS/Nuxt for the front end. Each tenant will get their own subdomain, like x.mysite.com, y.mysite.com, etc.
How can we make both our back end and front end read the subdomain name and share it with each other?
I have some understanding that in the Vue client we can read the subdomain using window.location. But I think that's too late. Is there a better way? And what about the Node/Express setup? How do we get the subdomain info there?
Note that Node/Express server is primarily an API to interface with database and for authentication.
Any help or insight to put us on the right path is appreciated.
I'm doing something similar in my app. My solution looks something like this...
Front End: In router.vue, I check the subdomain using window.location.host to see which routes to return. There are 3 options:
no subdomain loads the original routes (mysite.com)
portal subdomain loads the portal routes (portal.mysite.com)
any other subdomain loads the routes for the custom client subdomain, which can be anything and is dynamic
My routes for situation #3 look like this:
import HostedSiteHomePage from 'pages/hostedsite/hosted-site-home'

export const hostedSiteRoutes = [
  { path: '*', component: HostedSiteHomePage }
]
The asterisk means that any unmatched route will fall back to it.
In your fallback page (or any page), you will want this (beforeMount is the important part here):
beforeMount: function () {
  var host = window.location.host
  this.subdomain = host.split('.')[0]
  if (this.subdomain === 'www') this.subdomain = host.split('.')[1]
  this.fetchSiteContent()
},
methods: {
  fetchSiteContent() {
    if (!this.subdomain || this.subdomain === 'www') {
      this.siteContentLoaded = true
      this.errorLoadingSite = true
      return
    }
    // send the subdomain to the server and get back a configuration object
    http.get('/Site/LoadSite', { params: { site: this.subdomain } }).then((result) => {
      if (result && result.data && result.data.success === true) {
        this.siteContent = result.data.content
      } else {
        this.errorLoadingSite = true
      }
      this.siteContentLoaded = true
    }).catch((err) => {
      console.log("Error loading " + this.subdomain + "'s site", err)
      this.errorLoadingSite = true
      this.siteContentLoaded = false
    })
  },
}
I store a configuration object as JSON in the database for each subdomain, return it to the client side for a matching subdomain, and then update the site to match the information/options in the config object.
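On the Node/Express side, a minimal sketch of a matching endpoint might look like this (the /Site/LoadSite route and response shape mirror the client code above, but the in-memory siteConfigs store is a hypothetical stand-in for the database; note that Express also exposes req.subdomains if you'd rather read the Host header server-side):
const express = require('express')
const app = express()

// Hypothetical stand-in for the per-tenant config rows in the database
const siteConfigs = {
  x: { title: 'Tenant X', theme: 'dark' },
  y: { title: 'Tenant Y', theme: 'light' }
}

app.get('/Site/LoadSite', (req, res) => {
  // Subdomain sent by the client; req.subdomains[0] would also work here
  const site = req.query.site
  const content = site && siteConfigs[site]
  if (!content) {
    return res.json({ success: false })
  }
  res.json({ success: true, content: content })
})

app.listen(5000)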
Here is my router.vue
These domain names are supported:
mysite.com (loads main/home routes)
portal.mysite.com (loads routes specific to the portal)
x.mysite.com (loads routes that support dynamic subdomain, fetches config from server)
y.mysite.com (loads routes that support dynamic subdomain, fetches config from server)
localhost:5000 (loads main/home routes)
portal.localhost:5000 (loads routes specific to the portal)
x.localhost:5000 (loads routes that support dynamic subdomain, fetches config from server)
y.localhost:5000 (loads routes that support dynamic subdomain, fetches config from server)
import Vue from 'vue'
import VueRouter from 'vue-router'
// 3 different routes objects in routes.vue
import { portalRoutes, homeRoutes, hostedSiteRoutes } from './routes'

Vue.use(VueRouter);

function getRoutes() {
  let routes;
  var host = window.location.host
  var subdomain = host.split('.')[0]
  if (subdomain === 'www') subdomain = host.split('.')[1]
  console.log("Subdomain: ", subdomain)
  // check for localhost to work in the dev environment
  // another viable alternative is to override /etc/hosts
  if (subdomain === 'mysite' || subdomain.includes('localhost')) {
    routes = homeRoutes
  } else if (subdomain === 'portal') {
    routes = portalRoutes
  } else {
    routes = hostedSiteRoutes
  }
  return routes;
}

let router = new VueRouter({
  mode: 'history',
  routes: getRoutes()
})

export default router
As you can see, I have 3 different sets of routes, one of which supports dynamic subdomains. Once I load the dynamic subdomain page, I send a GET request to the server and fetch a configuration object that tells the front end what that site should look like.

cors-anywhere Page Dependencies (i.e. css, js, etc)

I am working with cors-anywhere on my localhost. I have the following server.js file...
var host = process.env.HOST || '0.0.0.0';
var port = process.env.PORT || 1234;
var cors_proxy = require('cors-anywhere');

cors_proxy.createServer({
  httpProxyOptions: {
    secure: false
  }
}).listen(port, host, function() {
  console.log('listening...');
});
This works and is proxying the initial request when going to:
http://localhost:1234/https://proxy-domain/page
The issue is that the page I am browsing/being proxied to has dependent files (i.e. css, javascript, images, etc.) that are not being loaded, because they are not being proxied appropriately. Looking at my browser's network tab, the dependent files are trying to be downloaded from...
http://localhost:1234/sample-image.gif
Really the url should be...
https://proxy-domain/sample-image.gif
How can I configure cors-anywhere to proxy all subsequent requests to the appropriate target url?
According to the documentation, you can set redirectSameOrigin to true, which will redirect requests to the same origin instead of proxying them:
cors_proxy.createServer({
  redirectSameOrigin: true,
  httpProxyOptions: {
    secure: false
  }
})

Multiple SSL Certificates and HTTP/2 with Express.js

Scenario:
I have an express.js server which serves variations of the same static landing page based on where req.headers.host says the user is coming from - think sort of like A/B testing.
GET tulip.flower.com serves pages/flower.com/tulip.html
GET rose.flower.com serves pages/flower.com/rose.html
At the same time, this one IP is also responsible for:
GET potato.vegetable.com serving pages/vegetable.com/potato.html
It's important that these pages are served FAST, so they are precompiled and optimized in all sorts of ways.
The server now needs to:
Provide separate certificates for *.vegetables.com, *.fruits.com, *.rocks.net
Optionally provide no certificate for *.flowers.com
Offer HTTP2
The problem is that HTTP2 mandates a certificate, and there's now multiple certificates in play.
It appears that it's possible to use multiple certificates on one Node.js (and presumably by extension Express.js) server, but is it possible to combine it with a module like spdy, and if so, how?
Instead of hacking Node, would it be smarter to pawn off the task of sorting out HTTP/2 and SSL to nginx? Or should a caching network like Imperva or Akamai handle this?
You can also use tls.createSecureContext; Nginx is not necessary.
My example is here:
const fs = require("fs");
const https = require("https");
const tls = require("tls");

const certs = {
  "localhost": {
    key: "./certs/localhost.key",
    cert: "./certs/localhost.crt",
  },
  "example.com": {
    key: "./certs/example.key",
    cert: "./certs/example.cert",
    ca: "./certs/example.ca",
  },
};

function getSecureContexts(certs) {
  if (!certs || Object.keys(certs).length === 0) {
    throw new Error("No certificates were found.");
  }
  const certsToReturn = {};
  for (const serverName of Object.keys(certs)) {
    const appCert = certs[serverName];
    certsToReturn[serverName] = tls.createSecureContext({
      key: fs.readFileSync(appCert.key),
      cert: fs.readFileSync(appCert.cert),
      // If the 'ca' option is not given, then node.js will use the default
      ca: appCert.ca ? sslCADecode(
        fs.readFileSync(appCert.ca, "utf8"),
      ) : null,
    });
  }
  return certsToReturn;
}

// if the CA bundle contains more certificates it will be parsed to an array
function sslCADecode(source) {
  if (!source || typeof (source) !== "string") {
    return [];
  }
  return source.split(/-----END CERTIFICATE-----[\s\n]+-----BEGIN CERTIFICATE-----/)
    .map((value, index, array) => {
      if (index) {
        value = "-----BEGIN CERTIFICATE-----" + value;
      }
      if (index !== array.length - 1) {
        value = value + "-----END CERTIFICATE-----";
      }
      return value.replace(/^\n+/, "").replace(/\n+$/, "");
    });
}

const secureContexts = getSecureContexts(certs);

const options = {
  // A function that will be called if the client supports the SNI TLS extension.
  SNICallback: (servername, cb) => {
    const ctx = secureContexts[servername];
    if (!ctx) {
      console.log(`No SSL certificate found for host: ${servername}`);
    } else {
      console.log(`SSL certificate has been found and assigned to ${servername}`);
    }
    if (cb) {
      cb(null, ctx);
    } else {
      return ctx;
    }
  },
};

const httpsServer = https.createServer(options, (req, res) => {
  console.log(res, req);
});

httpsServer.listen(443, function () {
  console.log("Listening https on port: 443");
});
If you want to test it:
edit /etc/hosts and add the record 127.0.0.1 example.com
open a browser at the url https://example.com:443
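Since the question also asks about HTTP/2: a hedged sketch (assuming the spdy package, which accepts the same TLS options as https.createServer, including SNICallback) would swap the https server for spdy and keep the same per-hostname contexts:
const spdy = require("spdy");
const express = require("express");

const app = express();
app.get("/", (req, res) => res.send("Hello over HTTP/2"));

// "options" is the same object with the SNICallback defined above
spdy.createServer(options, app).listen(443, () => {
  console.log("Listening on 443 with HTTP/2 + SNI");
});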
Nginx can handle SSL termination nicely, and this will offload SSL processing from your application servers.
If you have a secure private network between your nginx and application servers, I recommend offloading SSL via an nginx reverse proxy. In this practice nginx will listen on SSL (certificates will be managed on the nginx servers), then reverse proxy requests to the application servers over non-SSL, so the application servers don't need certificates, SSL config, or the SSL processing burden.
If you don't have a secure private network between your nginx and application servers, you can still use nginx as a reverse proxy by configuring the upstreams as SSL, but you will lose the offloading benefits.
CDNs can do this too. They are basically reverse proxy + caching, so I don't see a problem there.
Good read.
Let's Encrypt w/ Greenlock Express v3
I'm the author of Greenlock Express, which is Let's Encrypt for Node.js, Express, etc., and this use case is exactly what I made it for.
The basic setup looks like this:
require("greenlock-express")
.init(function getConfig() {
return {
package: require("./package.json")
manager: 'greenlock-manager-fs',
cluster: false,
configFile: '~/.config/greenlock/manager.json'
};
})
.serve(httpsWorker);
function httpsWorker(server) {
// Works with any Node app (Express, etc)
var app = require("./my-express-app.js");
// See, all normal stuff here
app.get("/hello", function(req, res) {
res.end("Hello, Encrypted World!");
});
// Serves on 80 and 443
// Get's SSL certificates magically!
server.serveApp(app);
}
It also works with node cluster so that you can take advantage of multiple cores.
It uses SNICallback to dynamically add certificates on the fly.
Site Management
The default manager plugin uses files on the file system, but there's great documentation on how to build your own.
Just to get started, the file-based plugin uses a config file that looks like this:
~/.config/greenlock/manager.json:
{
  "subscriberEmail": "letsencrypt-test@therootcompany.com",
  "agreeToTerms": true,
  "sites": [
    {
      "subject": "example.com",
      "altnames": ["example.com", "www.example.com"]
    }
  ]
}
Very Extensible
I can't post all the possible options here, but it's very small and simple to start with, and very easy to scale out with advanced options as you need them.

Use variable subdomains for routing with wildcard

I want to create an express application that uses dynamic/variable subdomains for routing. This is what I want to do:
http://<username>.mysite.dev should forward the requests to users/index.js
In users/index.js I will access the username via req.subdomains[0]. It would be nice if I could also run a regular expression check on <username>.
My first approach:
I am using the node package express-subdomain to route the requests:
/* import node packages */
var express = require('express'),
    subdomain = require('express-subdomain');

/* application config */
var app = express(),
    port = process.env.PORT || 3000;

/* import all apps */
var users = require('./users/index.js');

/* route requests by subdomain */
app.use(subdomain('*', users));

app.get('/', function (req, res) {
  /* Never get here */
  res.send('Homepage');
});

/* run app on given port */
app.listen(port, function () {
  console.log('Listening on port ' + port + ' ...');
});
The problem with this approach is that the * is not working properly. It forwards all requests to my users/index.js, even when there is no subdomain (http://mysite.dev). One solution would be to change the routing like this:
app.use(subdomain('*.users', users));
That way I can access users/index.js through http://<user>.users.mysite.dev and can still reach the normal site when there is no subdomain. But this approach is not really what I want: the extra users subdomain is too much. In addition, I cannot use regex.
Now I am searching for a better solution to this problem.
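One possible direction (a sketch under stated assumptions, not a tested answer: it relies on Express's built-in req.subdomains and assumes users/index.js exports an Express app or router) is to skip express-subdomain and gate the users app behind plain middleware that checks the subdomain count and a username regex:
var express = require('express');
var app = express();
var users = require('./users/index.js');

app.use(function (req, res, next) {
  // req.subdomains is built into Express: ['web1'] for web1.mysite.dev
  // (adjust app.set('subdomain offset', ...) if your base host has more parts)
  var sub = req.subdomains;
  if (sub.length === 1 && /^[a-z0-9-]+$/.test(sub[0])) {
    return users(req, res, next);
  }
  next();
});

app.get('/', function (req, res) {
  res.send('Homepage');
});

app.listen(3000);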

How do you get RequireSSL in NodeJS on an Azure Website?

So a little background first: a NodeJS server running in an Azure Website will automatically have all HTTPS requests directed to the HTTP endpoint. This allows the following code to work for both HTTP and HTTPS:
var http = require('http');
var express = require('express');
var app = express();
// *snip*
http.createServer(app).listen(process.env.PORT);
// can go to http://*.azurewebsites.net or https://*.azurewebsites.net without issue
From here I decided to create a "RequireSSL" middleware
/* snip */
if (req.protocol === 'http') {
  var origFullUrl = 'http://' + req.get('host') + req.originalUrl;
  var u = url.parse(origFullUrl, true);
  u.host = null; // need to clear so 'port' is used
  u.protocol = 'https';
  u.port = '443';
  res.redirect(url.format(u));
}
/* snip */
Here's where the background comes into play: because Azure redirects all HTTPS traffic to the HTTP protocol, req.protocol always equals 'http', creating a redirect loop.
Anyone know of a way to get this to work in an Azure Website?
You can detect this using the x-arr-ssl header. Please go through this: https://coderead.wordpress.com/2014/09/05/redirecting-to-https-in-node-js-on-azure-websites/
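In short, a minimal sketch of that approach (based on the linked post; x-arr-ssl is the header Azure's ARR front end adds when it terminates SSL):
// Treat the request as secure when Azure's SSL terminator forwarded it,
// which it marks with the x-arr-ssl header
function requireSSL(req, res, next) {
  var isSecure = req.secure || !!req.headers['x-arr-ssl'];
  if (!isSecure) {
    return res.redirect('https://' + req.get('host') + req.originalUrl);
  }
  next();
}

app.use(requireSSL);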
