Overriding a low-level Node.js module

Amazon S3 allows static website hosting, but with a requirement that the bucket name must match your domain name. This means your bucket name will look like: mydomain.com. Amazon S3 also provides a wildcard SSL certificate for *.s3.amazonaws.com. By the rules of TLS, this means com.s3.amazonaws.com IS covered by the certificate, but mydomain.com.s3.amazonaws.com is not. Node applications like Knox that connect to *.com.s3.amazonaws.com should really be able to trust that certificate, even though it breaks the rules of TLS, since the Knox library is a 'closed system': it only ever connects to an Amazon property.
The Node module https relies on tls.js, and tls.js has this function:
function checkServerIdentity(host, cert) {
  ...
  // "The client SHOULD NOT attempt to match a presented identifier in
  //  which the wildcard character comprises a label other than the
  //  left-most label (e.g., do not match bar.*.example.net)."
  // RFC6125
  if (!wildcards && /\*/.test(host) || /[\.\*].*\*/.test(host) ||
      /\*/.test(host) && !/\*\.\*\..+\..+/.test(host)) {
    return /$./;
  }
Which will properly return a "Certificate Mismatch" error. Can the upper-level Knox module override the checkServerIdentity function, which is several levels down and not called directly by Knox? I know how to override a function in a library I require, but not in libraries that those libraries themselves require.

There is a global cache for modules, which means that any function you override will be modified for all other modules. I think you can require tls yourself and patch checkServerIdentity:
// main.js
var tls = require('tls'),
    mod = require('./mod.js');

tls.checkServerIdentity = function (host, cert) {
  return true;
};

mod.test();

// mod.js
var tls = require('tls');

exports.test = function () {
  console.log(tls.checkServerIdentity()); // true
};

If you don't want to make changes to the global module objects (per your comment on Nik's answer), maybe you could use the rewire module. I imagine doing it something like this:
var rewire = require("rewire");

var knoxModule = rewire("./node_modules/knox/somefile.js");

knoxModule.__set__("tls", {
  checkServerIdentity: function (host, cert) {
    // some code
  }
});
I haven't ever worked with it though.
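As an aside (not from the original answers, and assuming a reasonably recent Node version): tls.connect, and therefore https.request, accepts a checkServerIdentity option, so the check can be relaxed per connection instead of patching the shared module. A minimal sketch, following the pattern in Node's own https docs, with the hostname taken from the question:

var https = require('https');

var options = {
  host: 'mydomain.com.s3.amazonaws.com',
  port: 443,
  path: '/',
  checkServerIdentity: function (host, cert) {
    // Returning undefined accepts the certificate; returning an Error rejects it.
    return undefined;
  }
};
// Pass the same options to the agent so they reach tls.connect
options.agent = new https.Agent(options);

var req = https.request(options, function (res) {
  console.log(res.statusCode);
});
req.end();

This keeps the relaxed check scoped to the one connection that needs it.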

Related

How does one secure API keys in SvelteKit 1.0?

I am using Ghost; I made an integration and I would like to hide the API key from the front-end. I do not believe I can set restrictions on the Ghost CMS (that would also work). And I believe +page.js files are also run in the browser, so I'm a little confused about how to achieve this.
The internal SvelteKit module $env/static/private (docs) is how you use secure API keys. SvelteKit will not allow you to import this module into client code, so it provides an extra layer of safety. Vite automatically loads your environment variables from .env files and process.env at build time and injects your key into your server-side bundle.
import { API_KEY } from '$env/static/private';
// Use your secret
SvelteKit has 4 modules for accessing environment variables:
$env/static/private (covered above)
$env/static/public — accessible by server and client, injected at build time (docs)
$env/dynamic/private — provided by your runtime adapter; only includes variables that do not start with your public prefix (which defaults to PUBLIC_) and can only be imported by server files (docs)
$env/dynamic/public — provided by your runtime adapter; only includes variables that do start with your public prefix (which defaults to PUBLIC_) (docs)
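To make the private/public split concrete, here is a minimal sketch (the variable names and .env contents are illustrative, not from the question):

// .env
// API_KEY=super-secret
// PUBLIC_BASE_URL=https://api.example.com

// +page.server.js — server-only, so the private import is allowed
import { API_KEY } from '$env/static/private';
import { PUBLIC_BASE_URL } from '$env/static/public';

export async function load({ fetch }) {
  // The key stays on the server; only the fetched data reaches the page
  const res = await fetch(`${PUBLIC_BASE_URL}/posts/?key=${API_KEY}`);
  return { posts: await res.json() };
}

Importing $env/static/private from a +page.js file or a component would fail the build.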
You don't need to hide the key.
Ghost Content API Docs:
These keys are safe for use in browsers and other insecure environments, as they only ever provide access to public data.
One common way to hide your third-party API key(s) from public view is to set up proxy API routes.
The general idea is to have your client (browser) query a proxy API route that you provide/host, have that proxy route query the third-party API using your credentials (API key), and pass on the results from the third-party API back to the client.
Because the query to the third-party API takes place exclusively on the back-end, your credentials are never exposed to the client (browser) and thus not visible to the public.
In your use case, you would have to create 3 dynamic endpoint routes to replicate the structure of Ghost's API:
src/routes/api/[resource]/+server.js to match /posts/, /authors/, /tags/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
src/routes/api/[resource]/[id]/+server.js to match /posts/{id}/, /authors/{id}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, id } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/${id}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
src/routes/api/[resource]/slug/[slug]/+server.js to match /posts/slug/{slug}/, /authors/slug/{slug}/, etc.:
const API_KEY = <your_api_key>; // preferably pulled from ENV
const GHOST_URL = `https://<your_ghost_admin_domain>/ghost/api/content`;

export function GET({ params, url }) {
  const { resource, slug } = params;
  const queryString = url.searchParams.toString();

  return fetch(`${GHOST_URL}/${resource}/slug/${slug}/?key=${API_KEY}${queryString ? `&${queryString}` : ''}`, {
    headers: {
      'Accept-Version': '5.0' // Ghost API version setting
    }
  });
}
Then all you have to do is call your proxy routes in place of your original third-party API routes in your app:
<!-- very barebones example -->
<script>
  let uri;
  let data;

  async function get() {
    const res = await fetch(`/api/${uri}`);
    data = await res.json();
  }
</script>

<input name="uri" bind:value={uri} />
<button on:click={get}>GET</button>

{data}
Note that using proxy API routes will also have the additional benefit of sidestepping potential CORS issues.

Is declaring a Node.js redis client as a const in multiple helpers a safe way to use it?

This is a little hard to articulate, so I hope my title isn't too terrible.
I have a frontend/backend React/Node.js (REST API) web app that I want to add Redis support to, for storing and retrieving app-global settings and per-user settings (like language preference, last login, etc.; simple stuff). So I was considering adding a /settings branch to my backend REST API to push/pull this information from a Redis instance.
This is where my Node.js inexperience comes through. I'm looking at using the ioredis client and it seems too easy. If I have a couple of helpers (more than one .js file which will call upon Redis), will constructing the client as a const in each be safe to do? Or is reusing a single instance the recommended approach?
Here's a sample of what I'm thinking of doing. Imagine I had 3 helper modules that require access to the Redis client. Should I declare a client as a const in each? Or centralize it in a single helper module and get the client from it? Is there a disadvantage to doing either?
const config = require('config.json');
const redis_url = config.redis_url;

// redis setup
const Redis = require('ioredis');
const redis = new Redis(redis_url);

module.exports = {
  test
};

async function test(id) {
  redis.get(id, function (err, result) {
    if (err) {
      console.error(err);
      throw err;
    } else {
      return result;
    }
  });
}
Thank you.
If no redis conflicts...
If the different "helper" modules you are referring to have no conflicts when interacting with redis, such as overwriting / using the same redis keys, then I can't see any reason not to use the same redis instance (as outlined by garlicman) and export this to the different modules in which it is used.
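A minimal sketch of that approach (file names assumed; the config shape follows the question's config.json):

// redis-client.js
const config = require('config.json');
const Redis = require('ioredis');

// Node caches this module, so every require('./redis-client')
// returns the same connection.
module.exports = new Redis(config.redis_url);

// some-helper.js
const redis = require('./redis-client');

exports.getSetting = async function (id) {
  return redis.get(id); // ioredis returns a Promise when no callback is given
};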
Otherwise use separate redis databases...
If you do require separate redis database connections, redis ships with 16 databases so you can specify which to connect to when creating a new instance - see below:
const redis = new Redis({ // SET UP CONFIG FOR CONNECTION TO REDIS
  port: 6379,        // Redis port
  host: '127.0.0.1', // Redis host
  family: 4,         // 4 (IPv4) or 6 (IPv6)
  db: 10,            // Redis database to connect to
});
Normally what I would do (in Java, say) is implement an explicit class with singleton access to hold the connection and any connection error/reconnect handling.
All modules in Node.js are already singletons, I believe, but what I will probably go with is a client class to hold it and my own access-related methods. Something like:
const config = require('config.json');
const Redis = require('ioredis');

// Named RedisClient so it doesn't shadow the required Redis constructor
class RedisClient {
  constructor() {
    this.client = new Redis(config.redis_url);
  }

  get(key) {
    return this.client.get(key);
  }

  set(key, value, ttl) {
    let rp;
    if (ttl === 0) {
      rp = this.client.set(key, value);
    } else {
      rp = this.client.set(key, value)
        .then((res) => {
          // arrow function keeps `this` bound to the RedisClient instance
          this.client.expire(key, ttl);
          return res;
        });
    }
    return rp;
  }
}

module.exports = new RedisClient();
I'll probably include a data_init() method to check and preload an initial key/value structure on first connect.
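For what it's worth, a hypothetical data_init() along those lines (the key names and defaults are purely illustrative) could look like:

// Inside the RedisClient class above
async data_init() {
  // Only seed defaults the first time this app ever connects
  const initialized = await this.client.get('app:initialized');
  if (!initialized) {
    await this.client.set('app:default_language', 'en');
    await this.client.set('app:initialized', '1');
  }
}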

Multiple SSL Certificates and HTTP/2 with Express.js

Scenario:
I have an express.js server which serves variations of the same static landing page based on where req.headers.host says the user is coming from - think sort of like A/B testing.
GET tulip.flower.com serves pages/flower.com/tulip.html
GET rose.flower.com serves pages/flower.com/rose.html
At the same time, this one IP is also responsible for:
GET potato.vegetable.com serving pages/vegetable.com/potato.html
It's important that these pages are served FAST, so they are precompiled and optimized in all sorts of ways.
The server now needs to:
Provide separate certificates for *.vegetables.com, *.fruits.com, *.rocks.net
Optionally provide no certificate for *.flowers.com
Offer HTTP/2
The problem is that HTTP/2 effectively mandates a certificate, and there are now multiple certificates in play.
It appears that it's possible to use multiple certificates on one Node.js (and presumably, by extension, Express.js) server, but is it possible to combine that with a module like spdy, and if so, how?
Instead of hacking Node, would it be smarter to pawn off the task of sorting out HTTP/2 and SSL to nginx? Or should a caching network like Imperva or Akamai handle this?
You can also use tls.createSecureContext; nginx is not necessary.
My example:
const fs = require("fs");
const https = require("https");
const tls = require("tls");

const certs = {
  "localhost": {
    key: "./certs/localhost.key",
    cert: "./certs/localhost.crt",
  },
  "example.com": {
    key: "./certs/example.key",
    cert: "./certs/example.cert",
    ca: "./certs/example.ca",
  },
};

function getSecureContexts(certs) {
  if (!certs || Object.keys(certs).length === 0) {
    throw new Error("No certificates were found.");
  }

  const certsToReturn = {};

  for (const serverName of Object.keys(certs)) {
    const appCert = certs[serverName];

    certsToReturn[serverName] = tls.createSecureContext({
      key: fs.readFileSync(appCert.key),
      cert: fs.readFileSync(appCert.cert),
      // If the 'ca' option is not given, Node.js will use its default CAs
      ca: appCert.ca ? sslCADecode(
        fs.readFileSync(appCert.ca, "utf8"),
      ) : null,
    });
  }

  return certsToReturn;
}

// if the CA bundle contains more certificates, it will be parsed into an array
function sslCADecode(source) {
  if (!source || typeof (source) !== "string") {
    return [];
  }

  return source.split(/-----END CERTIFICATE-----[\s\n]+-----BEGIN CERTIFICATE-----/)
    .map((value, index, array) => {
      if (index) {
        value = "-----BEGIN CERTIFICATE-----" + value;
      }
      if (index !== array.length - 1) {
        value = value + "-----END CERTIFICATE-----";
      }
      value = value.replace(/^\n+/, "").replace(/\n+$/, "");
      return value;
    });
}

const secureContexts = getSecureContexts(certs);

const options = {
  // A function that will be called if the client supports the SNI TLS extension.
  SNICallback: (servername, cb) => {
    const ctx = secureContexts[servername];

    if (!ctx) {
      console.log(`No SSL certificate found for host: ${servername}`);
    } else {
      console.log(`SSL certificate found and assigned to ${servername}`);
    }

    if (cb) {
      cb(null, ctx);
    } else {
      return ctx;
    }
  },
};

const httpsServer = https.createServer(options, (req, res) => { console.log(req, res); });

httpsServer.listen(443, function () {
  console.log("Listening https on port: 443");
});
If you want to test it:
edit /etc/hosts and add the record 127.0.0.1 example.com
open a browser at https://example.com:443
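As an aside (my assumption, not part of the original answer): the same SNICallback options also work with Node's built-in http2 module, which would cover the HTTP/2 requirement from the question, e.g.:

const http2 = require("http2");

const h2Server = http2.createSecureServer({
  ...options,       // reuse the SNICallback options built above
  allowHTTP1: true, // fall back to HTTP/1.1 for clients that don't negotiate h2
}, (req, res) => {
  res.end("ok");
});

h2Server.listen(443);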
Nginx can handle SSL termination nicely, and this will offload SSL processing from your application servers.
If you have a secure private network between your nginx and application servers, I recommend offloading SSL via an nginx reverse proxy. In this setup nginx listens on SSL (certificates are managed on the nginx servers), then reverse proxies requests to the application servers over non-SSL, so the application servers don't need certificates, SSL config, or the SSL processing burden.
If you don't have a secure private network between your nginx and application servers, you can still use nginx as a reverse proxy by configuring the upstreams as SSL, but you will lose the offloading benefits.
CDNs can do this too. They are basically reverse proxy + caching, so I don't see a problem there.
Let's Encrypt w/ Greenlock Express v3
I'm the author of Greenlock Express, which is Let's Encrypt for Node.js, Express, etc., and this use case is exactly what I made it for.
The basic setup looks like this:
require("greenlock-express")
.init(function getConfig() {
return {
package: require("./package.json")
manager: 'greenlock-manager-fs',
cluster: false,
configFile: '~/.config/greenlock/manager.json'
};
})
.serve(httpsWorker);
function httpsWorker(server) {
// Works with any Node app (Express, etc)
var app = require("./my-express-app.js");
// See, all normal stuff here
app.get("/hello", function(req, res) {
res.end("Hello, Encrypted World!");
});
// Serves on 80 and 443
// Get's SSL certificates magically!
server.serveApp(app);
}
It also works with node cluster so that you can take advantage of multiple cores.
It uses SNICallback to dynamically add certificates on the fly.
Site Management
The default manager plugin uses files on the file system, but there's great documentation on how to build your own.
Just to get started, the file-based plugin uses a config file that looks like this:
~/.config/greenlock/manager.json:
{
  "subscriberEmail": "letsencrypt-test@therootcompany.com",
  "agreeToTerms": true,
  "sites": [
    {
      "subject": "example.com",
      "altnames": ["example.com", "www.example.com"]
    }
  ]
}
Very Extensible
I can't post all the possible options here, but it's very small and simple to start with, and very easy to scale out with advanced options as you need them.

AWS SDK for Node.js connection management

Does aws-sdk for Node.js manage its connections through an internal pool?
Their documentation kind of leads me to believe that:
httpOptions (map) — A set of options to pass to the low-level HTTP request. Currently supported options are:
proxy [String] — the URL to proxy requests through
agent [http.Agent, https.Agent] — the Agent object to perform HTTP requests with. Used for connection pooling. Defaults to the global agent (http.globalAgent) for non-SSL connections. Note that for SSL connections, a special Agent object is used in order to enable peer certificate verification. This feature is only available in the Node.js environment.
But there's no way, at least none that I could find, that'd let me define any connection pool properties.
What are my options if I want to control the concurrent connections in use?
Is it better to let the SDK handle that?
You can give it an http.Agent with whatever settings you want for max sockets:
var AWS = require('aws-sdk');
var http = require('http');

AWS.config.update({
  httpOptions: {
    agent: new http.Agent(...)
  }
});
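For instance, a concrete sketch for SSL traffic (the maxSockets value of 25 is just an illustrative number, not a recommendation):

var AWS = require('aws-sdk');
var https = require('https');

AWS.config.update({
  httpOptions: {
    // Cap the number of concurrent HTTPS connections the SDK will open
    agent: new https.Agent({ maxSockets: 25 })
  }
});

var s3 = new AWS.S3();
// All subsequent S3 requests share the capped agent's connection pool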
I have been looking into this a little bit more.
I dug around and figured out the defaults being used.
The AWS SDK uses the Node http module, whose default maxSockets is Infinity. For SSL it uses the https module under the hood, with a default maxSockets of 50.
The relevant code snippet:
sslAgent: function sslAgent() {
  var https = require('https');
  if (!AWS.NodeHttpClient.sslAgent) {
    AWS.NodeHttpClient.sslAgent = new https.Agent({rejectUnauthorized: true});
    AWS.NodeHttpClient.sslAgent.setMaxListeners(0);

    // delegate maxSockets to globalAgent, set a default limit of 50 if current value is Infinity.
    // Users can bypass this default by supplying their own Agent as part of SDK configuration.
    Object.defineProperty(AWS.NodeHttpClient.sslAgent, 'maxSockets', {
      enumerable: true,
      get: function() {
        var defaultMaxSockets = 50;
        var globalAgent = https.globalAgent;
        if (globalAgent && globalAgent.maxSockets !== Infinity && typeof globalAgent.maxSockets === 'number') {
          return globalAgent.maxSockets;
        }
        return defaultMaxSockets;
      }
    });
  }
  return AWS.NodeHttpClient.sslAgent;
}
For manipulating the socket counts, see BretzL's answer.
There is, however, no way to set the agent for both http and https at once. You can work around this by updating the configuration as you switch from http to https and vice versa, as sketched below.
See: https://github.com/aws/aws-sdk-js/issues/1185
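A rough sketch of that workaround (my own illustration; the maxSockets values are arbitrary):

var AWS = require('aws-sdk');
var http = require('http');
var https = require('https');

// Call before issuing plain-HTTP requests
function usePlainHttpAgent() {
  AWS.config.update({
    httpOptions: { agent: new http.Agent({ maxSockets: 25 }) }
  });
}

// Call before issuing HTTPS requests
function useHttpsAgent() {
  AWS.config.update({
    httpOptions: { agent: new https.Agent({ maxSockets: 25 }) }
  });
}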

Explain to Mean.io beginner how Mean.io sample package's authentication works

I'm learning Mean.io from this tutorial video, which shows the example package (created by "mean package mymodule"; it is also described under "Packages" in the docs). I would like help in understanding how the given authentication/authorization works.
The default sample package/module has simple user authentication. On the client side,
myapp/packages/mymodule/public/views/index.html contains:
Server route that anyone can access
Server route that requires authentication
Server route that requires admin user
On the server side,
myapp/packages/mymodule/server/routes/mymodule.js, contains:
// The Package is passed automatically as first parameter
module.exports = function(Mymodule, app, auth, database) {

  app.get('/mymodule/example/anyone', function(req, res, next) {
    res.send('Anyone can access this');
  });

  app.get('/mymodule/example/auth', auth.requiresLogin, function(req, res, next) {
    res.send('Only authenticated users can access this');
  });

  app.get('/mymodule/example/admin', auth.requiresAdmin, function(req, res, next) {
    res.send('Only users with Admin role can access this');
  });

  ...
};
The magic of the different authentication levels lies in the second argument of app.get(), an optional authentication middleware: none, auth.requiresLogin, or auth.requiresAdmin.
This is the authentication magic (also on github):
myapp/packages/access/server/config/authorization.js:
/**
 * Generic require login routing middleware
 */
exports.requiresLogin = function(req, res, next) {
  if (!req.isAuthenticated()) {
    return res.send(401, 'User is not authorized');
  }
  next();
};

/**
 * Generic require Admin routing middleware
 * Basic Role checking - future release with full permission system
 */
exports.requiresAdmin = function(req, res, next) {
  if (!req.isAuthenticated() || !req.user.hasRole('admin')) {
    return res.send(401, 'User is not authorized');
  }
  next();
};
QUESTION A: Why is it "exports.requiresLogin" and "exports.requiresAdmin" in authorization.js instead of "somethingelse.requiresLogin" and "somethingelse.requiresAdmin"? Is this "exports" related to the exports in myapp/packages/access/server/config/passport.js (module.exports = function(passport) {...}, github)? If so, in what circumstances can we use this "exports"?
Since the authorization rules are written in the "access" package and used in the "mymodule" package, Mean.io packages are not independent of each other. The Access package is registered in
myapp/packages/access/app.js, github:
var mean = require('meanio'),
    Module = mean.Module,
    passport = require('passport');

var Access = new Module('access');

Access.register(function(database) {

  // Register auth dependency
  var auth = require('./server/config/authorization');
  require('./server/config/passport')(passport);

  // This is for backwards compatibility
  mean.register('auth', function() {
    return auth;
  });

  mean.register('passport', function() {
    return passport;
  });

  Access.passport = passport;
  Access.middleware = auth;

  return Access;
});
QUESTION B: Does Mean.io automatically link all the packages, or is there code somewhere to link packages? Is it linked by the part marked "This is for backwards compatibility", shown below? If so, where can "auth" be used? In all the packages in myapp/packages/? How about in the Mean.io base app directory myapp/?
var auth = require('./server/config/authorization');

// This is for backwards compatibility
mean.register('auth', function() {
  return auth;
});
QUESTION C: Why is it "Access.passport = passport;", but "middleware" for "Access.middleware = auth;"? What would happen if it were "Access.auth = auth"?
REGARDING QUESTION A (on the use of exports)
In Node.js, assigning values to the exports object makes those values available to the code that requires the source file.
For example, given file foo.js:
exports.foo = "FOO";
exports.bar = "BAR";
and file main.js:
var foo = require('./foo.js');
console.log('foo=', foo.foo, '; bar=', foo.bar);
running node main.js will output foo= FOO ; bar= BAR.
See, for example, Node's module documentation or this write-up on require and exports.
REGARDING QUESTION B (on package "linking")
The answer to this question is the complement to the answer to Question A.
There is code to "link" the packages. It is the require statement.
In your app.js source code, the first line (reading var mean = require('meanio')) will set the local variable mean to whatever is assigned to the exports object when the meanio module is loaded.
Same with passport = require('passport'). In that case, the local variable passport will be equal to the value of exports after index.js in the passport module is loaded.
REGARDING QUESTION C
I'm not entirely sure what you are asking here, but let me take a stab at it.
In this case:
1) var mean = require('meanio') in line 1 "imports" the meanio module, such that the local variable mean is more or less set equal to the value of exports in the meanio module.
2) Module = mean.Module in line 2 sets the local variable Module equal to the value of mean.Module, which must have been assigned in the meanio module.
3) var Access = new Module('access') is instantiating an instance of the Module class, assigning it to the local variable Access.
4) Access.passport = passport assigns the instance variable named passport within the instance of meanio.Module named Access (to the value of the passport module required on line #3).
5) Access.middleware = auth assigns the instance variable named middleware within the instance of meanio.Module named Access (to the value returned by require('./server/config/authorization') in line 11).
I'm not familiar with the "meanio" module, but based on this code it looks like you are configuring the meanio.Module("access") instance (named Access) by assigning specific "magic" variable names.
In other words, rather than Access.passport = passport; Access.middleware = auth you might have Access.setPassport(passport); Access.setMiddleware(auth) or (rather than line 5) var Access = new Module('access',passport,auth).
That is, the author of the "meanio" module seems to have decided to use special variable names to configure the class rather than "setter" methods or parameters passed to the constructor. I assume that somewhere in the meanio code you'll find a reference to something like this.middleware and this.passport, where the code is assuming you have "filled in" those instance variables as happens in the last few lines in your code sample.
If you were to add Access.auth = auth, then all that would happen is that the Access object would have a new attribute named auth whose value is equal to that of the local variable auth.
If you used Access.auth instead of Access.middleware, I assume whatever code in the Access class that is using this.middleware will fail, since no value was ever assigned to Access.middleware and Access.auth is not one of the "magic" variable names that meanio is looking for.
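To make that concrete, a purely hypothetical sketch of what such meanio-internal code might look like (this is my guess at the pattern, not the actual meanio source):

// Somewhere inside meanio (hypothetical), the "magic" property is read back:
Module.prototype.routes = function(app) {
  // If you assigned Access.auth instead of Access.middleware,
  // this.middleware would be undefined here and the routes would break.
  require(this.path + '/server/routes')(this, app, this.middleware);
};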
