How to use filesystem's createReadStream with Meteor router(NodeJS) - node.js

I need to allow the user of my app to download a file with Meteor. Currently, when the user requests a download, I insert into a "fileRequests" collection in Mongo a document with the file location and a timestamp of the request, and return the ID of the newly created request. When the client gets the new ID it immediately goes to mydomain.com/uploads/:id. I then use something like this to intercept the request before Meteor does:
var connect = Npm.require("connect");
var Fiber = Npm.require("fibers");
var path = Npm.require("path");
var fs = Npm.require("fs");
var mime = Npm.require("mime");

__meteor_bootstrap__.app
  .use(connect.query())
  .use(connect.bodyParser()) // I added this for file uploading
  .use(function (req, res, next) {
    Fiber(function () {
      if (req.method == "GET") {
        // get the id here, and stream the file using fs.createReadStream();
      }
      next();
    }).run();
  });
I check to make sure the file request was made less than 5 seconds ago, and I immediately delete the request document after I've queried it.
This works, and I think it is secure enough: no one can make a request without being logged in, and five seconds is a pretty small window for someone to hijack the created request URL. But I just don't feel right about my solution. It feels dirty!
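For reference, the freshness check can be reduced to a tiny pure function. This is a sketch with hypothetical names (the real code would compare against the timestamp stored in the fileRequests document):

```javascript
// Hypothetical sketch of the five-second window check described above.
// "requestedAt" would come from the fileRequests document in Mongo.
var MAX_AGE_MS = 5000;

function isRequestFresh(requestedAt, now) {
  var age = now - requestedAt;
  return age >= 0 && age <= MAX_AGE_MS;
}
```

Since the request document is deleted immediately after being queried, each URL is single-use as well as short-lived.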
So I attempted to use Meteor-Router to accomplish the same thing. That way I can check whether they're logged in, without the five-second open-to-the-world trickery.
So here's the code I wrote for that:
Meteor.Router.add('/uploads/:id', function (id) {
  var path = Npm.require('path');
  var fs = Npm.require('fs');
  var mime = Npm.require('mime');
  var res = this.response;
  var file = FileSystem.findOne({ _id: id });
  if (typeof file !== "undefined") {
    var filename = path.basename(file.filePath);
    var filePath = '/var/MeteorDMS/uploads/' + filename;
    var stat = fs.statSync(filePath);
    res.setHeader('Content-Disposition', 'attachment; filename=' + filename);
    res.setHeader('Content-Type', mime.lookup(filePath));
    res.setHeader('Content-Length', stat.size);
    var filestream = fs.createReadStream(filePath);
    filestream.pipe(res);
    return;
  }
});
This looks great, fits right in with the rest of the code, and is easy to read, no hacking involved. BUT! It doesn't work! The browser spins and spins and never quite knows what to do. I get ZERO error messages. I can keep using the app in other tabs. I don't know what it's doing; it never stops "loading". If I restart the server, I get a 0-byte file with all the correct headers, but no data.
Any help is greatly appreciated!!
EDIT:
After digging around a bit more, I noticed that trying to turn the response object into a JSON object results in a circular structure error.
Now the interesting thing is that when I listen to the filestream for the "data" event and attempt to stringify the response object, I don't get that error. But if I do the same thing in my first solution (listen to "data" and stringify the response), I get the error again.
So with the Meteor-Router solution, something is happening to the response object. I also noticed that on the "data" event, response.finished is flagged as true.
filestream.on('data', function (data) {
  fs.writeFile('/var/MeteorDMS/afterData', JSON.stringify(res));
});

The Meteor router installs a middleware to do the routing. All Connect middleware either MUST call next() (exactly once) to indicate that the response is not yet settled or MUST settle the response by calling res.end() or by piping to the response. It is not allowed to do both.
I studied the source code of the middleware (see below). We see that we can return false to tell the middleware to call next(). This means we declare that this route did not settle the response and we would like to let other middleware do their work.
Or we can return a template name, a text, an array [status, text] or an array [status, headers, text], and the middleware will settle the response on our behalf by calling res.end() using the data we returned.
However, by piping to the response, we already settled the response. The Meteor router should not call next() nor res.end().
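In other words, each middleware must take exactly one of the two exits. A stripped-down illustration of the contract, with plain functions standing in for real Connect objects:

```javascript
// A Connect-style middleware takes exactly one of two exits:
// settle the response itself, or pass control on with next() - never both.
function makeRouteMiddleware(match) {
  return function (req, res, next) {
    if (match(req)) {
      res.end('handled'); // we settled the response; next() must NOT be called
    } else {
      next();             // not ours; exactly one call to next()
    }
  };
}
```

Piping a stream into `res` counts as settling it, which is why the router's unconditional `res.end()` afterwards corrupts the response.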
We solved the problem by forking the Meteor router and making a small change. We replaced the else in line 87 (after if (output === false)) by:
else if (typeof(output)!="undefined") {
See the commit with sha 8d8fc23d9c in my fork.
This way, a bare return; in the route method tells the router to do nothing, since you already settled the response by piping to it.
Source code of the middleware as in the commit with sha f910a090ae:
// hook up the serving
__meteor_bootstrap__.app
  .use(connect.query()) // <- XXX: we can probably assume accounts did this
  .use(this._config.requestParser(this._config.bodyParser))
  .use(function (req, res, next) {
    // need to wrap in a fiber in case they do something async
    // (e.g. in the database)
    if (typeof(Fiber) == "undefined") Fiber = Npm.require('fibers');

    Fiber(function () {
      var output = Meteor.Router.match(req, res);

      if (output === false) {
        return next();
      } else {
        // parse out the various types of response we can have

        // array can be
        // [content], [status, content], [status, headers, content]
        if (_.isArray(output)) {
          // copy the array so we aren't actually modifying it!
          output = output.slice(0);

          if (output.length === 3) {
            var headers = output.splice(1, 1)[0];
            _.each(headers, function (value, key) {
              res.setHeader(key, value);
            });
          }

          if (output.length === 2) {
            res.statusCode = output.shift();
          }

          output = output[0];
        }

        if (_.isNumber(output)) {
          res.statusCode = output;
          output = '';
        }

        return res.end(output);
      }
    }).run();
  });

Related

.push() isn't saving to my array - NodeJS

I'm trying to code a web server in plain NodeJS with http. What I'm trying to do: when the URL is /add/..., whatever comes after /add/ should be added to an array and shown on /. But the array isn't being saved; it resets whenever I add something new. Can someone explain what I did wrong?
Sorry that my code is a bit messy.
Index.js:
const http = require('http');
const $conf = require('./config.json');

http.createServer((req, res) => {
  let $list = [];
  const url = req.url;
  const $req = url.split('/');
  const $req_q = url.split('?');
  if (req.url == "/") {
    res.write($list.toString());
  } else if (req.url == `/add/${$req[2]}`) {
    $list.push($req[2]);
    console.log($list);
    res.write(`ADDED ${$req[2]}`);
  } else {
    res.write('Error');
  }
  res.end();
}).listen($conf.port);
Config.json:
{
  "port": 80
}
When I go to /add/hi I get "ADDED hi". Then I go back to / and get nothing.
A NodeJS web server executes a route handler (e.g. '/' or '/add') on demand, meaning that everything inside the function runs from scratch on every request.
Therefore your $list variable gets initialized every time a route is called, and you never see the accumulated result on the screen.
You can fix it by moving the variable to the global scope.
const http = require('http');
const $conf = require('./config.json');

let $list = []; // Now this value is persisted between requests

http.createServer((req, res) => {
  const url = req.url;
  const $req = url.split('/');
  const $req_q = url.split('?');
  if (req.url == "/") {
    res.write($list.toString());
  } else if (req.url == `/add/${$req[2]}`) {
    $list.push($req[2]);
    console.log($list);
    res.write(`ADDED ${$req[2]}`);
  } else {
    res.write('Error');
  }
  res.end();
}).listen($conf.port);
The main reason is that the $list array is initialized on every new request. There are two ways of handling this at the moment:
Create a variable outside the route scope so you can access the previous value (this variable will be shared across every request/client).
Another way is to store the value persistently, for example with node-json-db: with thousands of requests, data you need to keep should survive restarts, and an in-memory array won't be enough.
Sample code:
import { JsonDB } from 'node-json-db';
import { Config } from 'node-json-db/dist/lib/JsonDBConfig';

// The first argument is the database filename. If there is no extension, '.json' is assumed and automatically added.
// The second argument tells the DB to save after each push.
// If you pass false, you'll have to call the save() method yourself.
// The third argument asks JsonDB to save the database in a human-readable format (default false).
// The last argument is the separator. By default it's a slash (/).
var db = new JsonDB(new Config("myDataBase", true, false, '/'));

// Push data into the database at the wanted DataPath.
// By default the push overrides the old value.
db.push("/test1", "super test");

// Get the data from the root...
var data = db.getData("/");
// ...or from a particular DataPath.
var data = db.getData("/test1");

// If you try to get data from a DataPath that doesn't exist, you'll get an Error.
try {
  var data = db.getData("/test1/test/dont/work");
} catch (error) {
  // The error will tell you where the DataPath stopped. In this case test1,
  // since /test1/test doesn't exist.
  console.error(error);
}
There are several other operations you can perform; for those, check its documentation.

Creating promises in loop

I have to create promises in a loop according to a given config file and return the response when all are resolved. Here goes the code:
let promiseArray = [];
for (let type in spotlight) {
  switch (type) {
    case "outliers": {
      let ops = spotlight[type];
      for (let i = 0; i < ops.length; i++) {
        (function (op) {
          let p = new Promise(function (resolve, reject) {
            let reqUrl = urlCreator(op.uri, op.query);
            apiService.get(reqUrl, function (isSuccess, data) {
              if (!isSuccess) {
                return reject(data);
              }
              // create objects array
              // let temp = [];
              // let overallScore = data.overall.score;
              // for (let day in overallScore) {
              //   temp.push({ "key": day, "value": parseFloat(overallScore[day]) });
              // }
              // let outliers = stats.outliers(temp, "key", "value");
              resolve({ "type": type, "name": op.name, "data": outliers });
            });
          });
          promiseArray.push(p);
        }(ops[i]));
      }
      break;
    }
    case "filters": {
      let ops = spotlight[type];
      for (let i = 0; i < ops.length; i++) {
        (function (op) {
          let p = new Promise(function (resolve, reject) {
            let reqUrl = urlCreator(op.uri, op.query);
            apiService.get(reqUrl, function (isSuccess, data) {
              if (!isSuccess) {
                return reject(data);
              }
              resolve({ "type": type, "name": op.name, "data": data });
            });
          });
          promiseArray.push(p);
        }(ops[i]));
      }
      break;
    }
  }
}
Promise.all(promiseArray).then(values => {
  return res.json(values);
}, reason => {
  return res.json(reason);
}).catch(reason => {
  return res.json(reason);
});
The problem is that the promises never settle, neither resolved nor rejected. According to the config file, it has to hit two URLs, say U1 and U2. I tried logging the output to see which requests return. When the server is started and the very first request is made, U1 returns and the request hangs. On refresh I get responses from U2, U2 and the request hangs; on refresh again U1, U1, and this continues. It seems that for some reason only one request returns and the other sits in a buffer somewhere, coming back only when the next request is made. Both requests are being made to the local server only; I am routing them externally just to make use of the cache, as the URL is used as the cache key.
I tried using dummy URLs like facebook.com and google.com, and it works perfectly fine. Using one local URL and one like facebook.com also works, but when both URLs are on the local server, it gets stuck.
Does it have anything to do with the single-threaded nature of Node, or with using the same socket for both requests?
PS: I am using the npm request module to make the URL calls.
Perhaps hesitating before making the second request would solve your problem.
I've made some tools that could help with that. See the MacroQTools.js file at
https://github.com/J-Adrian-Zimmer/JavascriptPromisesClarified.git
You're defining the request callback as function(success, data), while request expects error-first callbacks, defined like function(error, response).
You're calling request like:
apiService.get(reqUrl, function (isSuccess, data) {
  if (!isSuccess) {
    return reject(data);
  }
  resolve({ "type": type, "name": op.name, "data": data });
});
This pretends that, if the first parameter is missing, you have to reject with the second parameter, data. Really, it should be something like:
apiService.get(reqUrl, function (err, data) {
  if (err) {
    reject(err);
  } else {
    resolve({ "type": type, "name": op.name, "data": data });
  }
});
request expects error-first callbacks, like almost anything in Node that takes a callback.
So when the requests actually work as expected, your code is rejecting the promises with the real response value: on success, the first argument is null and data holds the real response.
This surely breaks something, though fixing it alone may not resolve your issue completely: I believe your requests are acting weird because of some configuration problem in your API, not just because you're rejecting promises when requests succeed (that alone would just send the data as the rejection reason).
Also, you're handling the rejection of Promise.all() twice: passing a second handler to then and calling catch as well. Only one is needed, and .catch(handler) is usually the better choice.
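To see why one handler is enough: Promise.all settles exactly once, and a trailing .catch() also covers anything thrown inside the success handler, which a second then argument does not. A self-contained sketch:

```javascript
// Promise.all resolves with an array of values, or rejects with the
// first rejection reason. A single trailing .catch() is enough.
function collect(promises) {
  return Promise.all(promises)
    .then(function (values) { return { ok: true, values: values }; })
    .catch(function (reason) { return { ok: false, reason: reason }; });
}
```

Either branch produces a settled result, so callers never need their own rejection handling.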
I made a small working example of how you can use Promise.all to collect async requests. I used imdb as the apiService, but any async HTTP service would work. I didn't reproduce your code exactly, but I'm sure you can adapt this to make your code work, at least the part that is consuming HTTP services.
var express = require('express');
var app = express();
var Promise = require('bluebird');
var imdb = require('imdb-api');

app.get('/', controllerHandler);

app.listen(3000, function () {
  console.log('Example app listening on port 3000!');
});

var apiService = {};
apiService.get = imdb.getReq;

function controllerHandler(request, response) {
  // Like iterating through spotlight.type and returning an array of promises from it.
  // In this case the array is of films, and Airbag is obviously the best of them.
  var promises = [{ name: 'The Matrix' }, { name: 'Avatar' }, { name: 'Airbag' }].map(createPromise);
  // Use either .catch(errorHandler) or .then(successHandler, errorHandler). The former is better:
  Promise.all(promises).then(successHandler).catch(errorHandler);

  function successHandler(result) {
    return response.json(result);
  }

  function errorHandler(reason) {
    console.log('There was an error calling the service:');
    console.log(reason);
    return response.send('there was an error');
  }
}

function createPromise(film) {
  return new Promise(function (resolve, reject) {
    apiService.get(film, function (err, data) {
      if (err)
        reject(new Error(err));
      else
        resolve({ title: data.title, year: data.year });
    });
  });
}

How to include output from ExpressJS res.render in a ZIP file?

I have a built a method in ExpressJS that exports a document as an HTML page:
html: function (req, res) {
  Project.findOne({ _id: req.params.project },
    function (err, project) {
      res.contentType('text/html');
      res.render('exporting/html', { name: project.name });
    }
  );
},
Additionally, I'd like to create a method that includes that generated HTML page, together with some static assets, in a ZIP archive.
Here's my current code:
zip: function (req, res) {
  Project.findOne({ _id: req.params.project },
    function (err, project) {
      res.contentType('application/zip');
      res.setHeader('content-disposition', 'attachment; filename=' + project.name + '.zip');
      var zip = new AdmZip();
      zip.addFile("readme.txt", new Buffer("This was inside the ZIP!"));
      // ------ Here I'd like to use zip.addFile to include the HTML output from the html method above ------
      res.send(zip.toBuffer());
    }
  );
}
How can I make the zip method include the output from the html method?
You have two options: one is relatively simple and the other is a bit more complicated. You'll have to decide which you believe is which. ;)
First method
Since you are relying on express' Response.render to create your HTML from a view, you'll need to call that route on your server to retrieve the content of the page so you can include it in your zip response.
Assuming you have var http=require('http'); somewhere in this file, you can:
zip: function (req, res) {
  var projectId = req.params.project || '';
  if (!projectId) { // make sure we have what we need!
    return res.status(404).send('requires a projectId');
  }
  // Ok, we start by requesting our project...
  Project.findOne({ id: projectId }, function (err, project) {
    if (err) { // ALWAYS handle errors!
      return res.status(500).send(err);
    }
    if ('object' !== typeof project) { // did we get what we expected?
      return res.status(404).send('failed to find project for id: ' + projectId);
    }
    var projectName = project.name || ''; // Make no assumptions!
    if (!projectName) {
      return res.status(500).send('whoops! project has no name!');
    }
    // For clarity, let's write a function to create our
    // zip archive which will take:
    //   1. a name for the zip file as it is sent to the requester
    //   2. an array of {name:'foo.txt',content:''} objs, and
    //   3. a callback which will send the result back to our
    //      original requester.
    var createZipArchive = function (name, files, cb) {
      // create our zip...
      var zip = new AdmZip();
      // add the files to the zip
      if (Array.isArray(files)) {
        files.forEach(function (file) {
          zip.addFile(file.name, new Buffer(file.content));
        });
      }
      // pass the filename and the new zip to the callback
      return cb(name, zip);
    };
    // And the callback that will send our response...
    //
    // Note that `res` as used here is the original response
    // object that was handed to our `zip` route handler.
    var sendResult = function (name, zip) {
      res.contentType('application/zip');
      res.setHeader('content-disposition', 'attachment; filename=' + name);
      return res.send(zip.toBuffer());
    };
    // Ok, before we formulate our response, we'll need to get the
    // html content from ourselves, which we can do by making
    // a get request with the proper url.
    //
    // Assuming this server is running on localhost:80, we can
    // use this url. If this is not the case, modify it as needed.
    var url = 'http://localhost:80/html';
    var httpGetRequest = http.get(url, function (getRes) {
      var body = ''; // we'll build up the result from our request here.
      // The 'data' event is fired each time the "remote" server
      // returns a part of its response. Remember that these data
      // can come in multiple chunks, and you do not know how many,
      // so let's collect them all into our body var.
      getRes.on('data', function (chunk) {
        body += chunk.toString(); // make sure it's not a Buffer!
      });
      // The 'end' event will be fired when there are no more data
      // to be read from the response, so it's here we can respond
      // to our original request.
      getRes.on('end', function () {
        var filename = projectName + '.zip',
            files = [
              {
                name: 'readme.txt',
                content: 'This was inside the ZIP!'
              }, {
                name: 'result.html',
                content: body
              }
            ];
        // Finally, call our zip creator passing our result sender...
        //
        // Note that we could have both built the zip and sent the
        // result here, but using the handlers we defined above
        // makes the code a little cleaner and easier to understand.
        //
        // It might have been nicer to create handlers for all the
        // functions herein defined in-line...
        return createZipArchive(filename, files, sendResult);
      });
    }).on('error', function (err) {
      // This handler will be called if the http.get request fails,
      // in which case we simply respond with a server error.
      return res.status(500).send('could not retrieve html: ' + err);
    });
  });
}
This is really the best way to solve your problem, even though it might seem complex. Some of the complexity can be reduced by using a better HTTP client library like superagent, which reduces all the event-handling rigmarole to a simple:
var request = require('superagent');
request.get(url, function (err, res) {
  ...
  var zip = new AdmZip();
  zip.addFile('filename', new Buffer(res.text));
  ...
});
Second method
The second method utilizes the render() method of express' app object, which is exactly what res.render() uses to convert views into HTML.
See Express app.render() for how this function operates.
Note that this solution is the same except for the portion annotated starting at // - NEW CODE HERE -.
zip: function (req, res) {
  var projectId = req.params.project || '';
  if (!projectId) { // make sure we have what we need!
    return res.status(404).send('requires a projectId');
  }
  // Ok, we start by requesting our project...
  Project.findOne({ id: projectId }, function (err, project) {
    if (err) { // ALWAYS handle errors!
      return res.status(500).send(err);
    }
    if ('object' !== typeof project) { // did we get what we expected?
      return res.status(404).send('failed to find project for id: ' + projectId);
    }
    var projectName = project.name || ''; // Make no assumptions!
    if (!projectName) {
      return res.status(500).send('whoops! project has no name!');
    }
    // For clarity, let's write a function to create our
    // zip archive which will take:
    //   1. a name for the zip file as it is sent to the requester
    //   2. an array of {name:'foo.txt',content:''} objs, and
    //   3. a callback which will send the result back to our
    //      original requester.
    var createZipArchive = function (name, files, cb) {
      // create our zip...
      var zip = new AdmZip();
      // add the files to the zip
      if (Array.isArray(files)) {
        files.forEach(function (file) {
          zip.addFile(file.name, new Buffer(file.content));
        });
      }
      // pass the filename and the new zip to the callback
      return cb(name, zip);
    };
    // And the callback that will send our response...
    //
    // Note that `res` as used here is the original response
    // object that was handed to our `zip` route handler.
    var sendResult = function (name, zip) {
      res.contentType('application/zip');
      res.setHeader('content-disposition', 'attachment; filename=' + name);
      return res.send(zip.toBuffer());
    };
    // - NEW CODE HERE -
    // Render our view, build our zip and send our response...
    app.render('exporting/html', { name: projectName }, function (err, html) {
      if (err) {
        return res.status(500).send('failed to render view: ' + err);
      }
      var filename = projectName + '.zip',
          files = [
            {
              name: 'readme.txt',
              content: 'This was inside the ZIP!'
            }, {
              name: 'result.html',
              content: html
            }
          ];
      // Finally, call our zip creator passing our result sender...
      //
      // Note that we could have both built the zip and sent the
      // result here, but using the handlers we defined above
      // makes the code a little cleaner and easier to understand.
      //
      // It might have been nicer to create handlers for all the
      // functions herein defined in-line...
      return createZipArchive(filename, files, sendResult);
    });
  });
}
While this method is somewhat shorter, by utilizing the underlying mechanism that Express uses to render views, it "couples" your zip route to the Express engine in such a way that, should the Express API change in the future, you'll need to make two changes to your server code (to properly handle the html route and the zip routes), rather than only one using the previous solution.
Personally, I favor the first solution as it is cleaner (in my mind) and more independent of unexpected change. But as they say YMMV ;).

What is the correct way to "hook" into app.response functions in express?

I need a way to add some data to the json and jsonp responses in Express after res.json is called. What is the correct way to add middleware for this? I know I could probably do something like:
var jsonTemp = function (status, data) {
  if (!data) {
    data = status;
    status = 200;
  }
  data.extraValue = 'foo';
  res.json(status, data);
};
res.json = jsonTemp;
but monkey patching like that seems like a "bad idea". Is there an official way to hook into the response with some kind of middleware?
I think that monkey patching actually might be the best solution without knowing more about what problem you are trying to solve. The code you showed does, however, crash the server, so don't use that.
app.use(function (req, res, next) {
  var orig = res.json;
  res.json = function json(status, data) {
    if (!data) {
      data = status;
      status = 200;
    }
    data.extraValue = 'foo';
    orig.call(this, status, data);
  };
  next();
});
Here I'm storing the original function in the variable orig and then delegating to that one when I'm done with the modification. Your code called your modified function over and over again, since that one now lives under res.json.
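The difference is easy to demonstrate outside Express, with a plain object standing in for the response (a hypothetical stand-in, not the real http.ServerResponse):

```javascript
// A stand-in "response" showing why keeping a reference to the original
// method matters: the wrapper delegates to orig instead of to itself.
var res = {
  json: function (data) { return 'sent ' + JSON.stringify(data); }
};

var orig = res.json;
res.json = function (data) {
  data.extraValue = 'foo';       // modify the payload...
  return orig.call(this, data);  // ...then delegate to the original
};
```

Calling res.json({ a: 1 }) now serializes the payload with extraValue merged in; calling res.json recursively instead of orig would never terminate.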
You can register your own template engine to format your output. Maybe that helps.

Node.js thumbnailer using Imagemagick: nondeterministic corruption

I have a Node.js server which dynamically generates and serves small (200x200) thumbnails from images (640x640) in a database (mongodb). I'm using the node-imagemagick module for thumbnailing.
My code works roughly 95% of the time; about 1 in 20 (or fewer) thumbnailed images are corrupt on the client (iOS), which reports:
JPEG Corrupt JPEG data: premature end of data segment
For the corrupt images, the client displays the top 50% - 75% of the image, and the remainder is truncated.
The behavior is non-deterministic and the specific images which are corrupt changes on a per-request basis.
I'm using the following code to resize the image and output the thumbnail:
im.resize({
  srcData: image.imageData.buffer,
  width: opt_width
}, function (err, stdout) {
  var responseHeaders = {};
  responseHeaders['content-type'] = 'image/jpeg';
  responseHeaders['content-length'] = stdout.length;
  debug('Writing ', stdout.length, ' bytes.');
  response.writeHead(200, responseHeaders);
  response.write(stdout, 'binary');
  response.end();
});
What could be wrong, here?
Notes:
The problem is not an incorrect content-length header. When I omit the header, the result is the same.
When I do not resize the image, the full-size image always seems to be fine.
In researching this I found this and this StackOverflow questions, which both solved the problem by increasing the buffer size. In my case the images are very small, so this seems unlikely to be responsible.
I was originally assigning stdout to a new Buffer(stdout, 'binary') and writing that. Removing it ('binary' will be deprecated) made no difference.
The problem seems to have been due to a slightly older version of node-imagemagick (0.1.2); upgrading to 0.1.3 was the solution.
In case this is helpful to anyone, here's the code I used to make Node.js queue up and handle client requests one at a time.
// Set up your server like normal.
http.createServer(handleRequest);
// ...

var requestQueue = [];
var isHandlingRequest = false; // Prevents new requests from being handled.

// If you have any endpoints that don't always call response.end(), add them here.
var urlsToHandleConcurrently = {
  '/someCometStyleThingy': true
};

function handleRequest(req, res) {
  if (req.url in urlsToHandleConcurrently) {
    handleQueuedRequest(req, res);
    return;
  }
  requestQueue.push([req, res]); // Enqueue new requests.
  processRequestQueue(); // Check if a request in the queue can be handled.
}

function processRequestQueue() {
  // Continue only if no request is being processed and the queue is not empty.
  if (isHandlingRequest) return;
  if (requestQueue.length == 0) return;
  var op = requestQueue.shift();
  var req = op[0], res = op[1];
  // Wrap .end() on the http.ServerResponse instance to
  // unblock and process the next queued item.
  res.oldEnd = res.end;
  res.end = function (data) {
    res.oldEnd(data);
    isHandlingRequest = false;
    processRequestQueue();
  };
  // Start handling the request, blocking the queue until res.end() is called.
  isHandlingRequest = true;
  handleQueuedRequest(req, res);
}

function handleQueuedRequest(req, res) {
  // Your regular request handling code here...
}
