cors-anywhere Page Dependencies (i.e. css, js, etc) - node.js

I am working with cors-anywhere on my localhost. I have the following server.js file...
var host = process.env.HOST || '0.0.0.0';
var port = process.env.PORT || 1234;
var cors_proxy = require('cors-anywhere');

cors_proxy.createServer({
    httpProxyOptions: {
        secure: false
    }
}).listen(port, host, function() {
    console.log('listening...');
});
This works and is proxying the initial request when going to:
http://localhost:1234/https://proxy-domain/page
The issue is that on the page I am browsing/being proxied to, there are dependent files (i.e. css, javascript, images, etc.) that are not being loaded because they are not being proxied appropriately. Looking at my browser's network tab, the dependent files are trying to be downloaded from...
http://localhost:1234/sample-image.gif
Really the URL should be...
https://proxy-domain/sample-image.gif
How can I configure cors-anywhere to proxy all subsequent requests to the appropriate target URL?

According to the documentation, you can set redirectSameOrigin to true, which will redirect requests to the same origin instead of proxying them.
cors_proxy.createServer({
    redirectSameOrigin: true,
    httpProxyOptions: {
        secure: false
    }
})

Related

How to determine http vs https in nodejs / nextjs api handler

In order to properly build the URLs in my XML sitemaps and RSS feeds, I want to determine whether the webpage is currently served over http or https, in a way that also works locally in development.
export default function handler(req, res) {
  const host = req.headers.host;
  const proto = req.connection.encrypted ? "https" : "http";
  // construct url for xml sitemaps
}
With the above code, however, it still shows as being served over http, even on Vercel. I would expect it to run as https. Is there a better way to figure out http vs https?
Since Next.js API routes run behind a proxy that offloads TLS (so traffic reaches the handler over http), the protocol reported is http.
By changing the code to the following I was able to check which protocol the proxy itself was reached on.
const proto = req.headers["x-forwarded-proto"];
However, this breaks in development, where you are not running behind a proxy, or with a different way of deploying the solution that also might not involve a proxy. To support both use cases I eventually ended up with the following code.
const proto =
  req.headers["x-forwarded-proto"] ||
  (req.connection.encrypted ? "https" : "http");
Whenever the x-forwarded-proto header is not present (undefined) we fall back to req.connection.encrypted to determine if we should serve on http vs https.
Now it works on localhost as well as on a Vercel deployment.
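As a usage sketch (the baseUrl variable and the JSON response are illustrative assumptions, not part of the original answer), the handler could build absolute URLs like this:

export default function handler(req, res) {
  const host = req.headers.host;
  // Prefer the proxy's x-forwarded-proto header; fall back to the socket when there is no proxy
  const proto =
    req.headers["x-forwarded-proto"] ||
    (req.connection.encrypted ? "https" : "http");
  const baseUrl = `${proto}://${host}`;
  // e.g. use `${baseUrl}/sitemap.xml` when generating sitemap and RSS entries
  res.status(200).json({ baseUrl });
}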
My solution:
import { GetServerSideProps } from "next";

export const getServerSideProps: GetServerSideProps = async (context: any) => {
    // Derive the origin (protocol + host) from the referer header of the incoming request
    const reqUrl = context.req.headers["referer"];
    const url = new URL(reqUrl);
    console.log(url.protocol); // "http:" or "https:"
    const origin = url.origin;
    // Fetch data from external API, e.g.:
    // const res = await fetch(`${origin}/api/projets`)
    // const data = await res.json()
    // Pass data to the page via props
    return { props: {} };
};

Connecting to Websocket in OpenShift Online Next Gen Starter

I'm in the process of trying to get an application which I'd built on the old OpenShift Online 2 free service up and running on the new OpenShift Online 3 Starter, and I'm having a bit of trouble.
The application uses websocket, and in the old system all that was required was for the client to connect to my server on port 8443 (which was automatically routed to my server). That doesn't seem to work in the new setup however - the connection just times out - and I haven't been able to find any documentation about using websocket in the new system.
My first thought was that I needed an additional route, but 8080 is the only port option available for routing as far as I can see.
The app lives here, and the connection is made on line 21 of this script with the line:
this.socket = new WebSocket( 'wss://' + this.server + ':' + this.port, 'tabletop-protocol' );
Which becomes, in practice:
this.socket = new WebSocket( 'wss://production-instanttabletop.7e14.starter-us-west-2.openshiftapps.com:8443/', 'tabletop-protocol' );
On the back end, the server setup is unchanged from what I had on OpenShift 2, aside from updating the IP and port lookup from env as needed, and adding logging to help diagnose the issues I've been having.
For reference, here's the node.js server code (with the logic trimmed out):
var http = require( "http" );
var ws = require( "websocket" ).server;
// Trimmed some others used by the logic...

var ip = process.env.IP || process.env.OPENSHIFT_NODEJS_IP || '0.0.0.0';
var port = process.env.PORT || process.env.OPENSHIFT_NODEJS_PORT || 8080;

/* FILE SERVER */
// Create a static file server for the client page
var pageHost = http.createServer( function( request, response ){
    // Simple file server that seems to be working, if a bit slowly
    // ...
} ).listen( port, ip );

/* WEBSOCKET */
// Create a websocket server for ongoing communications
var wsConnections = [];
var wsServer = new ws( { httpServer: pageHost } );

// Start listening for events on the server
wsServer.on( 'request', function( request ){
    // Server logic for the app, but nothing in here ever gets hit
    // ...
} );
In another question it was suggested that nearly anything - including this - could be related to the ongoing general issues with US West 2, but other related problems I was experiencing seem to have cleared, and that issue has been open for a week with no update, so I figured I'd dig deeper into this on the assumption that it's something I'm doing wrong rather than something on their end.
Anyone know more about this and what I need to do to make it work?

Proxy subdomain of one domain to the subdomain of another domain

I want to set up a basic proxy like the one browser-sync has. For example, I want to go to web1.domain1.xyz and it should proxy to web1.domain.xyz (so it's anything.domain1.xyz to anything.domain.xyz).
I got this part working already:
// HTTP Proxy
var simpleHttp = require("http");
var simpleHttpProxy = require("http-proxy");
var simpleProxy = simpleHttpProxy.createServer();

simpleHttp.createServer(function(req, res){
    var target = 'http://' + req.headers.host.replace(/domain1.xyz|domain2.xyz|domain3.xyz/gi, 'domain.xyz');
    simpleProxy.web(req, res, {
        rewriteRules: true,
        xfwd: true,
        toProxy: true,
        changeOrigin: true,
        hostRewrite: true,
        autoRewrite: true,
        protocolRewrite: true,
        target: target
    });
}).listen(4000);
I would also like to be able to point multiple domains at it, for example domain1.com, domain2.com, domain3.com.
However, it doesn't replace the links in the proxied pages. I don't know how browser-sync does this; I've been studying https://github.com/BrowserSync/browser-sync/blob/master/lib/server/proxy-server.js and https://github.com/BrowserSync/browser-sync/blob/master/lib/server/proxy-utils.js
I see that they use a custom function to replace the links, but I didn't succeed in implementing their logic.
I'm not sure how to do what you're asking for with http-proxy, but I'm almost certain it can be done with the vhost middleware. There's even a suitable example in its README. I've always found vhost to be very light and functional.
var connect = require('connect')
var serveStatic = require('serve-static')
var vhost = require('vhost')

var mailapp = connect()
// add middlewares to mailapp for mail.example.com

// create app to serve static files on subdomain
var staticapp = connect()
staticapp.use(serveStatic('public'))

// create main app
var app = connect()

// add vhost routing to main app for mail
app.use(vhost('mail.example.com', mailapp))

// route static assets for "assets-*" subdomain to get
// around max host connections limit on browsers
app.use(vhost('assets-*.example.com', staticapp))

// add middlewares and main usage to app
app.listen(3000)
Edit: I thought you had a working example and didn't understand the question correctly :-/ Read further down.
You can chain the replace calls:
// HTTP Proxy
var simpleHttp = require("http");
var simpleHttpProxy = require("http-proxy");
var simpleProxy = simpleHttpProxy.createServer();

var domains = ['domain1.xyz', 'domain2.xyz', 'domain3.xyz'];

simpleHttp.createServer(function(req, res){
    // Calling replace for every entry in domains
    var domain = req.headers.host;
    for (var i in domains) {
        domain = domain.replace(new RegExp(domains[i], 'gi'), 'domain.xyz');
    }
    var target = 'http://' + domain;
    simpleProxy.web(req, res, {
        rewriteRules: true,
        xfwd: true,
        toProxy: true,
        changeOrigin: true,
        hostRewrite: true,
        autoRewrite: true,
        protocolRewrite: true,
        target: target
    });
}).listen(4000);
If you want to proxy to different servers you can make a dictionary and replace with the mapped value:
// Proxy domain1 => domain2, and domain3 => domain4
var domains = {'domain1.xyz': 'domain2.xyz', 'domain3.xyz': 'domain4.xyz'};
[...]
for (var i in domains) {
    domain = domain.replace(new RegExp(i, 'gi'), domains[i]);
}
Edit:
To rewrite the links you really do have to replace them in the content. Browser-sync uses a regex to parse the links out of the content and replaces each link with a simple string.replace. The functions to look at are here: https://github.com/BrowserSync/browser-sync/search?utf8=%E2%9C%93&q=rewriteLinks
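As a rough sketch of that idea using http-proxy alone (this is not Browser-sync's actual implementation; the selfHandleResponse option lets you buffer and edit the proxied body yourself):

var http = require('http');
var httpProxy = require('http-proxy');

// selfHandleResponse lets us write the proxied response body ourselves
var proxy = httpProxy.createProxyServer({ selfHandleResponse: true });

proxy.on('proxyRes', function (proxyRes, req, res) {
    var chunks = [];
    proxyRes.on('data', function (chunk) { chunks.push(chunk); });
    proxyRes.on('end', function () {
        var body = Buffer.concat(chunks).toString('utf8');
        // Naive link rewrite: point absolute links back at the proxying host
        body = body.replace(/domain\.xyz/gi, 'domain1.xyz');
        var headers = Object.assign({}, proxyRes.headers);
        delete headers['content-length']; // length changed after the rewrite
        res.writeHead(proxyRes.statusCode, headers);
        res.end(body);
    });
});

// Note: this sketch ignores compressed (gzip) responses and non-HTML content
http.createServer(function (req, res) {
    proxy.web(req, res, { target: 'http://domain.xyz', changeOrigin: true });
}).listen(4000);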

Use variable subdomains for routing with wildcard

I want to create an express application that uses dynamic/variable subdomains for routing. This is what I want to do:
http://<username>.mysite.dev should forward the requests to the users/index.js
In users/index.js I will access the username via req.subdomains[0]. It would be nice if I could also run a regular expression check on <username>.
My first approach:
I am using the node package express-subdomain to route the requests:
/* import node packages */
var express = require('express'),
    subdomain = require('express-subdomain');

/* application config */
var app = express(),
    port = process.env.PORT || 3000;

/* import all apps */
var users = require('./users/index.js');

/* route requests by subdomain */
app.use(subdomain('*', users));

app.get('/', function(req, res) {
    /* Never get here */
    res.send('Homepage');
});

/* run app on given port */
app.listen(port, function() {
    console.log('Listening on port ' + port + ' ...');
});
The problem with this approach is that the * is not working properly. It forwards all requests to my users/index.js even when there is no subdomain (http://mysite.dev). One solution for this problem would be to change the routing like this:
app.use(subdomain('*.users', users));
This way I can access users/index.js through http://<user>.users.mysite.dev and I can also reach the normal site when there is no subdomain. But this approach is not really what I want - the extra users subdomain is too much. In addition, I cannot use a regex.
Now, I am searching for a better solution for this problem.

How to redirect multiple subdomains to the same running express app

I'm building a SaaS app in NodeJS and using the Express framework. The individual members of the website have a URL with a custom subdomain to log in.
For example, a company called ABC Corp may log in at abc.example.com and another company called Sony may log in at sony.example.com
Any idea how I can redirect/route multiple subdomains to the same app instance?
You can use the express-subdomain package. Assuming you have a routes folder, containing abc.js and sony.js files that respectively export login routes for the abc and sony subdomains, you could have the following in index.js or whatever file from which your express server is listening.
const subdomain = require('express-subdomain');
const express = require('express');
const app = express();

const abcRoutes = require('./routes/abc');
const sonyRoutes = require('./routes/sony');
const port = process.env.PORT || 3000;

// use the subdomain middleware
app.use(subdomain('abc', abcRoutes));
app.use(subdomain('sony', sonyRoutes));

// a simple get route on the top-level domain
app.get('/', (req, res) => {
    res.send('Welcome to the Home Page!');
});

// add any other needed routes

module.exports = app.listen(port, () => {
    console.log('Server listening on port ' + port);
});
Your server will then be live and working as expected:
http://example.com/ --> Welcome to the Home Page!
http://abc.example.com/login --> (Your login page for abc)
http://sony.example.com/login --> (Your login page for sony)
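The routes files themselves are not shown in the answer; a minimal sketch of what routes/abc.js might export (the response text is just a placeholder) could be:

const express = require('express');
const router = express.Router();

// Handles GET http://abc.example.com/login
router.get('/login', (req, res) => {
    res.send('Login page for ABC Corp');
});

module.exports = router;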
To test subdomains locally you need to add them to your /etc/hosts file (this requires sudo permissions):
127.0.0.1 localhost
127.0.0.1 example.com
127.0.0.1 abc.example.com
127.0.0.1 sony.example.com
The equivalent of the /etc/hosts file on Windows is at %systemroot%\system32\drivers\etc\hosts
For more details on setting up localhost domains locally, check here
You can do more with the subdomain package. It accepts wildcards and you can use it to check API keys if you need such a feature.
Check out the documentation for the express-subdomain package at https://npmjs.com/package/express-subdomain
You can actually handle the particular route you want to redirect (or a whole range of routes via a RegExp, since Express allows app.get(new RegExp('(your|string)\/here'), function…)), and then perform the redirect with something like the code below:
response.writeHead(302, {
    'Location': 'yourpath/page.html'
    // add other headers here...
});
response.end();
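Put together, a hedged sketch of an Express route that matches with a RegExp and then issues the redirect (the pattern and target path are placeholders) could be:

app.get(new RegExp('(your|string)\/here'), function(request, response) {
    // 302 redirect to the desired location
    response.writeHead(302, {
        'Location': 'yourpath/page.html'
    });
    response.end();
});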
Update 1 (as per the comments and other updates):
You can then handle all requests with the following app:
express()
    .use(express.vhost('abc.example.com', require('/path/to/loginApp').app))
    .use(express.vhost('sony.example.com', require('/path/to/loginApp').app))
    .listen(80);
where /path/to/loginApp can be an absolute or a relative path.
I hope this solves your problem.
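Note that express.vhost shipped with Express 3 and was removed in Express 4; the standalone vhost middleware provides the same behavior, so a rough Express 4 equivalent of the snippet above would be:

const express = require('express');
const vhost = require('vhost');

const app = express();

// Mount the same login app on both subdomains
app.use(vhost('abc.example.com', require('/path/to/loginApp').app));
app.use(vhost('sony.example.com', require('/path/to/loginApp').app));

app.listen(80);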
Update 2:
Actually, when a request comes in, the request event is raised on an HTTP server, so it is handled by Express. express.vhost is a middleware function that raises the request event on another instance of an HTTP server; that is how it works.
Below is the code:
// Note: regexp and server come from the enclosing scope in the original middleware source
function vhost(req, res, next) {
    if (!req.headers.host) return next();
    var host = req.headers.host.split(':')[0];
    // assignment (not comparison) is intentional here
    if (req.subdomains = regexp.exec(host)) {
        req.subdomains = req.subdomains[0].split('.').slice(0, -1);
        server.emit('request', req, res);
    } else {
        next();
    }
}
