Node.js leaking path info, how to solve it? - security

I have a webserver running, and if I curl it from another server with something like this:
curl http://myserver.com/../../../../../etc/rsyslog.conf
then I can see the server info.
Is that a known problem?
UPDATE
here is my server code:
app = express.createServer(
  gzip.staticGzip(__dirname + '/public', { maxAge: 5000 }),
  express.cookieParser(),
  express.bodyParser()
);
I got a fix like this:
var urlSecurity = function () {
  return function (req, res, next) {
    if (req.url.indexOf('../') >= 0) {
      res.send('<div>Server Error</div>', 500);
    } else if (req.url.indexOf('/..') >= 0) {
      res.send('<div>Server Error</div>', 500);
    } else {
      next();
    }
  };
};
app = express.createServer(
  urlSecurity(),
  gzip.staticGzip(__dirname + '/public', { maxAge: 5000 }),
  express.cookieParser(),
  express.bodyParser()
);
is this good enough?

You have a serious security flaw in your program. Fix it immediately.
My best guess from the presented symptom is that you're doing something like:
http.createServer(function (request, response) {
  var file = path.resolve('/path/to/files', request.url);
  fs.createReadStream(file).pipe(response);
})
This is extremely unwise! Always sanitize user input. In this case, it's quite easy:
http.createServer(function (request, response) {
  var requestedFile = path.join('/', request.url);
  var file = path.join('/path/to/files', requestedFile);
  fs.createReadStream(file).pipe(response);
})
So, first we path.join the requested url onto '/'. This will get rid of any .. shenanigans, making it more sanitary. Then, we path.join that sanitized path onto our files directory.
Why use path.join rather than path.resolve in this case? Because path.join just joins path parts, rather than resolving them, so a leading / won't have any ill effects.
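To illustrate the difference (hypothetical paths, my own example):

const path = require('path');

// path.join concatenates and then normalizes, so '..' segments
// cannot climb above the leading '/':
path.join('/', '../../../etc/passwd');         // => '/etc/passwd'
path.join('/path/to/files', '/etc/passwd');    // => '/path/to/files/etc/passwd'

// path.resolve treats an absolute segment as a new root, so a
// leading '/' in user input would escape the files directory:
path.resolve('/path/to/files', '/etc/passwd'); // => '/etc/passwd'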

After the immediate fix, I did a lot of testing, and I can confirm the following:
It is NOT primarily a Node problem; it is the gzippo module causing it. Gzippo 0.1.3 has the problem, 0.1.4 does not. Not sure why that is, but better not to use the older version of gzippo.

The simplest solution: insecureFileName.split('/').pop() always returns just the file name.
'index.html'.split('/').pop() => 'index.html'
'../../../index.html'.split('/').pop() => 'index.html'
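One caveat worth adding (my own note): splitting on '/' alone leaves Windows-style backslash separators intact, so a more defensive sketch might be:

// minimal sketch; insecureFileName stands in for untrusted input
var safeName = insecureFileName.split('/').pop().split('\\').pop();
// e.g. '..\\..\\index.html'.split('/').pop() would otherwise survive as-is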

I sanitize user filenames with:
path.basename(filename);
e.g.:
const path = require('path');
let filename = '../../../../../../../etc/passwd';
filename = path.basename(filename); // 'passwd'
let pathToFile = path.join('/path/from/config/to', filename);
console.log(pathToFile); // '/path/from/config/to/passwd'

Related

fs.createReadStream getting a different path than what's being passed in

I'm using NodeJS on a VM. One part of it serves up pages, and another part is an API. I've run into a problem, where fs.createReadStream attempts to access a different path than what is being passed into the function. I made a small test server to see if it was something else in the server affecting path usage, for whatever reason, but it's happening on my test server as well. First, here's the code:
const fs = require('fs');
const path = require('path');
const csv = require('csv-parser');
const readCSV = (filename) => {
  console.log('READ CSV GOT ' + filename); // show me what you got
  return new Promise((resolve, reject) => {
    const arr = [];
    fs.createReadStream(filename)
      .pipe(csv())
      .on('data', row => {
        arr.push(row);
      })
      .on('error', err => {
        console.log(err);
      })
      .on('end', () => {
        resolve(arr);
      });
  });
};
// tried this:
// const dir = path.relative(
//   path.join('path', 'to', 'this', 'file'),
//   path.join('path', 'to', 'CONTENT.csv')
// );
// tried a literal relative path:
// const dir = '../data/CONTENT.csv';
// tried a literal absolute path:
// const dir = '/repo/directory/server/data/CONTENT.csv';
// tried an absolute path:
const dir = path.join(__dirname, 'data', 'CONTENT.csv');
const content = readCSV(dir)
  .then(result => { console.log(result[0]); })
  .catch(err => { console.log(err); });
...but any way I slice it, I get the following output:
READ CSV GOT /repo/directory/server/data/CONTENT.csv
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open '/repo/directory/data/CONTENT.csv'
i.e., is fs.createReadStream somehow stripping out the directory of the server, for some reason? I suppose I could hard code the directory into the call to createReadStream, maybe? I just want to know why this is happening.
Some extra info: I'm stuck on node v8.11 and can't go any higher. On the server itself, I believe I'm using the older function(param) {...} style instead of arrow functions -- but the behavior is exactly the same.
Please help!!
The code is working correctly.
I think your file CONTENT.csv should be in the data folder, i.e. "/repo/directory/data/CONTENT.csv".
I'm answering my own question because I found an answer, even though I'm not entirely sure why it works, and at least it's interesting. To the best of my estimation, it's got something to do with the call stack and where Node.js identifies the function call as originating from. I've got my server set up in an MVC pattern, so my main app.js is in the root dir, the function being called is in the /controllers folder, and I've been trying to do relative paths from that folder. I'm still not sure why absolute paths didn't work.
The call stack goes:
app.js:
app.use('/somepath', endpointRouter);
...then in endpointRouter.js:
router.get('/request/file', endpointController.getFile);
...then finally in endpointController.js:
const readCSV = filename => {
  // the code I shared
};
exports.getFile = (req, res, next) => {
  // code that calls readCSV(filename)
};
...and I believe that because Node views the chain as originating from app.js, it then treats all relative paths as relative to app.js, in my root folder. Basically when I switched to the super unintuitive single-dot-relative path: './data/CONTENT.csv', it worked with no issue.
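For what it's worth, a likely explanation (my note, not part of the original post): fs resolves relative paths against the process's current working directory, i.e. wherever node was launched, not against the file containing the call. A small sketch, assuming node was started from /repo/directory:

const path = require('path');

// inside controllers/endpointController.js:
console.log(process.cwd()); // /repo/directory (where node was launched)
console.log(__dirname);     // /repo/directory/controllers (this file's folder)

// a relative path resolves against process.cwd(), not this file:
path.resolve('./data/CONTENT.csv'); // /repo/directory/data/CONTENT.csv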

Instrumenting files on the fly with Istanbul

I can instrument a file/folder and write it to disk like so:
$ istanbul instrument public --output public-coverage --embed-source true
however I am wondering if there is a way to instrument files on the fly and serve them to the browser without ever writing the instrumented files to disk. Something like this:
const fs = require('fs');
const cp = require('child_process');

app.use(function (req, res, next) {
  const file = req.path; // whatever
  const k = cp.spawn('istanbul', ['instrument']);
  fs.createReadStream(file).pipe(k.stdin); // feed the source to istanbul
  k.stdout.pipe(res);                      // stream the instrumented output back
});
does anyone know if that's possible and how?
Actually it is.
Below is an example where I'd just intercept the request for a normal "main.js" file and return the "instrumented" version instead.
Just a proof of concept, without any error handling and only for a specific file, but I think you get the point.
Alternatively, you can load up istanbul in your code (require("istanbul")) and perform the instrumentation without actually spawning a process; see the sketch after the example below.
app.get("/main.js", (req, res, next) => {
const cmd = path.join(__dirname, "node_modules", ".bin", "istanbul");
const file = path.join(__dirname, "public/main.js");
const s = spawn(cmd, ["instrument", file, "--embed-source", "--no-compact", "--preserve-comments"]);
s.stdout.on("data", (data) => {
res.send(data);
});
});
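And a rough sketch of the in-process variant mentioned above, assuming the istanbul 0.x API (the package exports an Instrumenter class whose instrument() takes source text plus a filename and calls back with the instrumented code):

const fs = require("fs");
const path = require("path");
const istanbul = require("istanbul");

app.get("/main.js", (req, res, next) => {
  const file = path.join(__dirname, "public/main.js");
  const instrumenter = new istanbul.Instrumenter({ embedSource: true });
  fs.readFile(file, "utf8", (err, code) => {
    if (err) return next(err);
    instrumenter.instrument(code, file, (err2, instrumented) => {
      if (err2) return next(err2);
      res.type("js").send(instrumented);
    });
  });
});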

Add All Routes in ./routes to Middleware Stack

Right now I am using app.use() and require() for each route in my routes directory to add them to the middleware stack (I am using Express).
app.use('/', require('./routes/index'));
app.use('/users', require('./routes/users'));
app.use('/post', require('./routes/post'));
app.use('/submitPost', require('./routes/submitPost'));
...
Instead of doing this manually for each file, I would like to use a for-loop to iterate through the route files in ./routes and add each file to the middleware stack. This is what I have, but it isn't working:
require('fs').readdir('/routes', function (err, files) {
  if (!err) {
    for (var i = 0; i < files.length; i++) {
      var file = files[i].substr(files[i].lastIndexOf('.'));
      app.use('/' + file, require('./routes/' + file));
    }
  }
});
Could someone help me correct this bit of code? On another note, are there any disadvantages to automatically adding all routes in ./routes to the middleware stack?
Thanks in advance.
The main issue here is probably when you are adding the middleware. You are using readdir - the asynchronous method. You likely have a catch-all 404 handler declared after your code, and as the routes you are requiring are added asynchronously, they will probably be added after the catch-all. When the request propagates through the middleware, this would terminate it before it even got to the route.
One other issue is the path you are using: /routes will attempt to look in the root of your filesystem. ./routes or __dirname + '/routes' is probably what you want.
The following code sample works for me:
var files = require('fs').readdirSync('./routes');
for (var i = 0; i < files.length; i++) {
  var file = files[i].substr(0, files[i].lastIndexOf('.'));
  app.use('/' + file, require('./routes/' + file));
}
By the way, you can use file-manifest for this. It was actually created specifically for this use case, although it still expects you to call app.use yourself, since order matters for express routes.
So you can do something like:
var fm = require('file-manifest');
var routes = fm.generate('./routes');
app.use('/', routes.home);
app.use('/foo', routes.foo);
// etc.
If you really want it to all happen magically, you could make that work with a custom reduce function, but this is much more explicit and ensures that routes are set up in the right order (so you don't end up with /foo falling before /foo/bar and preventing it from being reached).
I believe I am supposed to qualify that I wrote this library.
There are a few ways to do this. Here's a clean implementation using the basic fs and path modules.
var fs = require("fs"),
path = require("path");
var root = "./routes/"
fs.readdir(root, function (err, files) {
if (err) {
throw err;
}
files.forEach(function (file) {
var filename = file.slice(0, -3);
var routePath = '/' + ((filename === 'index') ? '' : filename); //filter index to use just '/'
app.use(routepath, require(root + filename));
});
});

How to get list of all routes I am using in restify server

I have an app designed as follows:
//server.js =====================================================
var restify = require('restify'),
    route1 = require('./routes/route1'),
    route2 = require('./routes/route2'),
    ....

var server = restify.createServer({
  name: 'xyz_server'
});

route1(server);
route2(server);
Now each route file looks like this:
//route1.js =====================================================
module.exports = function (server) {
  server.get('/someRoute', function (req, res, next) {
    // .. do something
  });
  server.get('/anotherRoute', function (req, res, next) {
    // .. something else
  });
};
Now the issue is that we have dozens of route files and hundreds of routes in total.
There are multiple developers working on this project and several routes are being added daily.
Is there a function in restify gives me a list of all routes in the system ?
What I am looking for is something like:
server.listAllRoutes();
Is anyone aware of this ?
Try something like this
function listAllRoutes(server) {
  console.log('GET paths:');
  server.router.routes.GET.forEach(function (value) {
    console.log(value.spec.path);
  });
  console.log('PUT paths:');
  server.router.routes.PUT.forEach(function (value) {
    console.log(value.spec.path);
  });
}

listAllRoutes(server);
This should list all GET and PUT paths, adding POST and DELETE should be easy :)
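A slightly more general sketch of the same idea, assuming the same pre-5.x internals (server.router.routes being an object keyed by HTTP method):

function listAllRoutes(server) {
  // iterate every HTTP method the router knows about
  Object.keys(server.router.routes).forEach(function (method) {
    console.log(method + ' paths:');
    server.router.routes[method].forEach(function (route) {
      console.log('  ' + route.spec.path);
    });
  });
}

listAllRoutes(server);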
2019 update: server.router.routes is no longer available; instead we have server.router.getRoutes(), which returns an object of the registered routes keyed by route name. So we can log all the routes using:
function listAllRoutes(server) {
  Object.values(server.router.getRoutes()).forEach(value =>
    console.log(
      `ENDPOINT REGISTERED :: ${value.method} :: ${server.url}${value.path}`
    )
  );
}
http://restify.com/docs/server-api/#server
There is a router.getRoutes() method, but it returns an object which is not the best to work with for listing things. You could fiddle around with that to turn it into an array with the shape that you like.
Alternatively, you can access all the routes as an array and then map them, even better if you use a lib like better-console to give you console.table in node. The following is working nicely for me in restify#8.3.0:
import console from 'better-console';

function listRoutes(server) {
  // beware: these are probably private APIs, they could be subject to change
  const { routes } = server.router._registry._findMyWay;
  const mapped = routes.map(({ method, path }) => ({ method, path }));
  console.table(mapped.sort((a, b) => a.method > b.method));
}

Node.JS - fs.exists not working?

I'm a beginner in Node.js, and was having trouble with this piece of code.
var fs = require('fs');

Framework.Router = function () {
  this.run = function (req, res) {
    fs.exists(global.info.controller_file, function (exists) {
      if (exists) {
        // Here's the problem
        res.writeHead(200, {'Content-Type': 'text/html'});
        var cname = App.ucfirst(global.info.controller) + 'Controller';
        var c = require(global.info.controller_file);
        var c = new App[cname]();
        var action = global.info.action;
        c[action].apply(global.info.action, global.info.params);
        res.end();
      } else {
        App.notFound();
        return false;
      }
    });
  };
};
The problem lies in the part after checking whether global.info.controller_file exists: I can't seem to get the code inside the if (exists) { ... } block to work.
I tried logging the values of all the variables in that section, and they have their expected values; however, the line c[action].apply(global.info.action, global.info.params);
is not running as expected. It is supposed to call a function in the controller_file that does a simple res.write('hello world');. I wasn't having this problem before I started checking for the file with fs.exists; everything inside the if statement worked perfectly fine before this check.
Why is the code not running as expected? Why does the request just time out?
Does it have something to do with the whole synchronous vs asynchronous thing? (Sorry, I'm a complete beginner)
Thank you
Like others have commented, I would suggest you rewrite your code to bring it more in-line with the Node.js design patterns, then see if your problem still exists. In the meantime, here's something which may help:
The advice about not using require dynamically at "run time" should be heeded, and calling fs.exists() on every request is tremendously wasteful. However, say you want to load all *.js files in a directory (perhaps a "controllers" directory). This is best accomplished using an index.js file.
For example, save the following as app/controllers/index.js
var fs = require('fs');

var files = fs.readdirSync(__dirname);
var dotJs = /\.js$/;

for (var i in files) {
  if (files[i] !== 'index.js' && dotJs.test(files[i]))
    exports[files[i].replace(dotJs, '')] = require('./' + files[i]);
}
Then, at the start of app/router.js, add:
var controllers = require('./controllers');
Now you can access the app/controllers/test.js module by using controllers.test. So, instead of:
fs.exists(controllerFile, function (exists) {
  if (exists) {
    ...
  }
});
simply:
if (controllers[controllerName]) {
  ...
}
This way you can retain the dynamic functionality you desire without unnecessary disk IO.
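To make that concrete, the run method from the question might end up looking roughly like this (a sketch reusing the App and global.info names from the original code):

var controllers = require('./controllers');

Framework.Router = function () {
  this.run = function (req, res) {
    // synchronous lookup instead of fs.exists + dynamic require
    if (!controllers[global.info.controller]) {
      return App.notFound();
    }
    res.writeHead(200, {'Content-Type': 'text/html'});
    var cname = App.ucfirst(global.info.controller) + 'Controller';
    var c = new App[cname]();
    // call the action with the controller instance as `this`
    c[global.info.action].apply(c, global.info.params);
    res.end();
  };
};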
