I'm currently working on a project that requires content to be gzipped before it's sent back to the browser.
I'm currently using a simple read stream and piping the data to the response of a request, but I'm not sure of the best way to gzip content without blocking requests.
The line that sends the data is:
require('fs').createReadStream(self.staticPath + Request.url).pipe(Response);
The following class is the static handler object:
(function(){
    var StaticFeeder = function()
    {
        this.staticPath = process.cwd() + '/application/static';
        this.contentTypes = require('./contenttypes');
    }

    StaticFeeder.prototype.handle = function(Request, Response, callback)
    {
        var self = this;
        if(Request.url == '/')
        {
            return false;
        }
        if(Request.url.indexOf('../') > -1)
        {
            return false;
        }
        require('path').exists(this.staticPath + Request.url, function(isthere){
            /*
             * If no file exists, pass back to the main handler and return
             * */
            if(isthere === false)
            {
                callback(false);
                return;
            }
            /*
             * Get the extension if possible
             * */
            var ext = require('path').extname(Request.url).replace('.','');
            /*
             * Get the Content-Type
             * */
            var ctype = self.contentTypes[ext] !== undefined ? self.contentTypes[ext] : 'application/octet-stream';
            /*
             * Send the Content-Type
             * */
            Response.setHeader('Content-Type', ctype);
            /*
             * Create a readable stream and send the file
             * */
            require('fs').createReadStream(self.staticPath + Request.url).pipe(Response);
            /*
             * Tell the main handler we have dealt with the response
             * */
            callback(true);
        });
    }

    module.exports = new StaticFeeder();
})();
Can anyone help me get around this problem? I haven't a clue how to tell the pipe to compress with gzip.
Thanks
Actually, I have a blog post on just this thing: http://dhruvbird.blogspot.com/2011/03/node-and-proxydecorator-pattern.html
You will need to run:
npm install compress -g
before using it, though.
The basic idea revolves around using pipes to add functionality.
However, for your use-case, you would be better off putting node.js behind nginx and letting nginx do all the gzipping, since node.js runs your JavaScript on a single thread and the gzip routines would eat up your process's CPU.
You can just pipe it through a compression stream:
var fs = require('fs')
var zlib = require('zlib')
fs.createReadStream(file)
.pipe(zlib.createGzip())
.pipe(Response)
This assumes the file is not already compressed and that you have already set all the headers for the response.
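For the handler in the question, a minimal sketch of wiring this in could look like the following (the Accept-Encoding check is my own addition; without it you would serve gzip to clients that can't decode it):
var fs = require('fs');
var zlib = require('zlib');

if (/\bgzip\b/.test(Request.headers['accept-encoding'] || '')) {
    // advertise the encoding, then stream file -> gzip -> response
    Response.setHeader('Content-Encoding', 'gzip');
    fs.createReadStream(self.staticPath + Request.url)
        .pipe(zlib.createGzip())
        .pipe(Response);
} else {
    fs.createReadStream(self.staticPath + Request.url).pipe(Response);
}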
Related
I am new to Node.js and am learning how Express works. I found a package called send on npm. So I read the code, but I have a question: where does this.res come from?
SendStream.prototype.isCachable = function isCachable () {
  var statusCode = this.res.statusCode
  return (statusCode >= 200 && statusCode < 300) || statusCode === 304
}
I read the constructor of Stream, which the send module inherits from, but I do not find this.res in that constructor. If you can give me a few ideas on where to look, it will help a lot.
It looks to me like this.res is only set when you call .pipe() here in the code:
SendStream.prototype.pipe = function pipe (res) {
  // root path
  var root = this._root

  // references
  this.res = res; // <=========== here's where it is set

  // decode the path
  var path = decode(this.path)
  if (path === -1) {
    this.error(400)
    return res
  }
  ....
}
And, since the purpose of this library is to stream data to an http response, it appears that the way you use it is always send(...).pipe(res).
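For reference, a minimal sketch of that pattern in a plain http server (based on the send package's documented usage; the port is arbitrary):
var http = require('http');
var send = require('send');

http.createServer(function (req, res) {
    // stream the file matching the request path to the response;
    // headers and errors are handled internally by the SendStream
    send(req, req.url).pipe(res);
}).listen(3000);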
Assume that I have a Node.js module whose index.js is as below:
module.exports = function() {
    var map = {},
        CSV_FILE = "./input.csv",
        _loadCSV = function() {
            var fs = require('fs'),
                parse = require('csv-parse'),
                rawCSVData = fs.readFileSync(CSV_FILE).toString(),
                i,
                item;
            parse(rawCSVData, {columns: true}, function(err, data) {
                for (i = 0; i < data.length; i++) {
                    item = data[i];
                    map[item.key] = item.value;
                }
            });
        },
        _init = function() {
            _loadCSV();
        };

    // Init
    _init();

    // Public
    return {
        /**
         * getValue
         */
        getValue: function(key) {
            return map[key];
        }
    };
};
Now, everything works fine if I test locally. However, when I install this module in another project I get the error below.
fs.js:549
  return binding.open(pathModule._makeLong(path), stringToFlags(flags), mode);
                 ^
Error: ENOENT: no such file or directory, open 'input.csv'
    at Error (native)
Is it possible to include a static mapping file as part of a Node.js module and use it in module initialization?
Your problem is this line: CSV_FILE = "./input.csv". It works locally because the script you're executing (index.js) is in the same directory as input.csv. However, when you install the module as a dependency, input.csv actually ends up somewhere like ./node_modules/your-module/input.csv, so your index.js can't find any ./input.csv file: relative paths in fs calls are resolved against the process's working directory, not the module's own directory.
There are two ways to solve this, the first one being the smartest in my opinion.
Do not distribute the input.csv file at all. Shipping loose data files is a fragile way to build modules; instead, change your code so that your module accepts a path to a .csv file that it loads. If your module really does need static data, it's smarter to convert it to a JavaScript object and include it directly (see the sketch below).
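A minimal sketch of the embedded-data approach (the file name data.js and its keys are hypothetical):
// data.js - the former input.csv converted to a plain object
module.exports = {
    key1: 'value1',
    key2: 'value2'
};
// index.js - no file I/O needed at load time
var map = require('./data');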
Simply change one line of code,
from CSV_FILE = "./input.csv"
to CSV_FILE = __dirname + "/input.csv"
See the documentation for __dirname.
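Equivalently, path.join produces the same path and normalizes separators across platforms; a minimal sketch:
var path = require('path');
var CSV_FILE = path.join(__dirname, 'input.csv');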
I have the following code:
Meteor.methods({
    saveFile: function(blob, name, path, encoding) {
        var path = cleanPath(path), fs = __meteor_bootstrap__.require('fs'),
            name = cleanName(name || 'file'), encoding = encoding || 'binary',
            chroot = Meteor.chroot || 'public';

        // Clean up the path. Remove any initial and final '/' -we prefix them-,
        // any sort of attempt to go to the parent directory '..' and any empty directories in
        // between '/////' - which may happen after removing '..'
        path = chroot + (path ? '/' + path + '/' : '/');

        // TODO Add file existence checks, etc...
        fs.writeFile(path + name, blob, encoding, function(err) {
            if (err) {
                throw (new Meteor.Error(500, 'Failed to save file.', err));
            } else {
                console.log('The file ' + name + ' (' + encoding + ') was saved to ' + path);
            }
        });

        function cleanPath(str) {
            if (str) {
                return str.replace(/\.\./g,'').replace(/\/+/g,'').
                    replace(/^\/+/,'').replace(/\/+$/,'');
            }
        }

        function cleanName(str) {
            return str.replace(/\.\./g,'').replace(/\//g,'');
        }
    }
});
I took the code from this project:
https://gist.github.com/dariocravero/3922137
The code works fine and it saves the file, however it repeats the call several times, and each time it causes Meteor to reset (I am using the Windows version, 0.5.4). The F12 console ends up filled with 503 errors [screenshot omitted]. The Meteor console loops over the startup code each time the 503 happens and repeats the console logs in the saveFile function.
Furthermore, in the target directory the image thumbnail keeps displaying, then shows as broken, then as a valid thumbnail again, as if fs were writing it multiple times.
Here is the code that calls the function:
"click .savePhoto":function(e, template){
e.preventDefault();
var MAX_WIDTH = 400;
var MAX_HEIGHT = 300;
var id = e.srcElement.id;
var item = Session.get("employeeItem");
var file = template.find('input[name='+id+']').files[0];
// $(template).append("Loading...");
var dataURL = '/.bgimages/'+file.name;
Meteor.saveFile(file, file.name, "/.bgimages/", function(){
if(id=="goodPhoto"){
EmployeeCollection.update(item._id, { $set: { good_photo: dataURL }});
}else{
EmployeeCollection.update(item._id, { $set: { bad_photo: dataURL }});
}
// Update an image on the page with the data
$(template.find('img.'+id)).delay(1000).attr('src', dataURL);
});
},
What's causing the server to reset?
My guess would be that, since Meteor has a built-in automatic directory scan that watches for file changes (in order to relaunch the application on the newest code base), the file you are creating is actually causing the server reset.
Meteor doesn't scan directories beginning with a dot (so-called "hidden" directories), such as .git for example, so you could use this behaviour to your advantage by setting the path of your files to a .directory of your own.
You should also consider using writeFileSync, insofar as Meteor methods are intended to run synchronously (inside node fibers), contrary to the usual node way of asynchronous calls. In this code it's no big deal, but, for example, you couldn't use any Meteor mechanics inside the writeFile callback:
asynchronousCall(function(error, result) {
    if (error) {
        // handle error
    }
    else {
        // do something with result
        Collection.update(id, result); // error ! Meteor code must run inside fiber
    }
});

var result = synchronousCall();
Collection.update(id, result); // good to go !
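Applied to the saveFile method above, a minimal synchronous sketch could look like this (reusing the variables already defined in the method):
try {
    fs.writeFileSync(path + name, blob, encoding);
    console.log('The file ' + name + ' (' + encoding + ') was saved to ' + path);
} catch (err) {
    throw new Meteor.Error(500, 'Failed to save file.', err);
}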
Of course there is a way to turn any asynchronous call into a synchronous one using fibers/future, but that's beside the point of this question: I recommend watching the EventedMind episode on node Futures to understand this specific area.
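For completeness, a hedged sketch of that fibers/future pattern (it assumes the fibers package is reachable, e.g. via __meteor_bootstrap__.require as in the question's code; the helper name is hypothetical):
var Future = __meteor_bootstrap__.require('fibers/future');

function writeFileFiberSync(filePath, blob, encoding) {
    var future = new Future();
    fs.writeFile(filePath, blob, encoding, function(err) {
        if (err) future.throw(err);
        else future.return();
    });
    return future.wait(); // blocks this fiber only, not the whole process
}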
I'm using node.js, and through the socket.io library I receive chunks of data that are actually jpeg images. These images are frames of a realtime video captured from a remote webcam. I'm forced to stream the video as jpeg frames. I'm looking for a way to convert these jpeg images into a video file (mpeg 4 or mjpeg) on the fly. Does node have a library that can do this? I already took a look at the node-fluent-ffmpeg library, but the only examples given were about converting jpeg files to a video, not converting a stream of jpeg images on the fly. Alternatively, does ffmpeg for Windows support a stream of jpeg images as input?
FFMPEG supports streams as inputs, as stated in the docs.
You can add any number of inputs to an Ffmpeg command. An input can
be [...] a readable stream
So for instance it supports using
ffmpeg().input(fs.createReadStream('/path/to/input3.avi'));
which creates a Readable stream from the file at '/path/to/input3.avi'.
I don't know anything about FFMPEG, but you may pull your messages coming from socket.io (each message may already be a Buffer) and wrap them in your own implementation of a Readable stream, as sketched below.
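A minimal sketch of that idea using fluent-ffmpeg (the socket variable, the 'frame' event name, and the output settings are all assumptions):
var stream = require('stream');
var ffmpeg = require('fluent-ffmpeg');

// PassThrough is a ready-made Readable/Writable stream: we write each
// incoming jpeg Buffer into it and hand it to ffmpeg as a single input.
var jpegStream = new stream.PassThrough();

socket.on('frame', function(buf) {
    jpegStream.write(buf); // one message = one jpeg frame
});

ffmpeg()
    .input(jpegStream)
    .inputFormat('mjpeg') // treat the bytes as concatenated jpeg frames
    .output('out.mp4')
    .run();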
I think you should look at videofy:
var exec = require("child_process").exec;
var escape = require("shell-escape");
var debug = require("debug")("videofy");
var mkdirp = require("mkdirp");
var uid = require("uid2");

/*
 * Expose videofy
 */

module.exports = videofy;

/**
 * Convert `input` file to `output` video with the given `opts`:
 *
 *  - `rate` frame rate [10]
 *  - `codec` the video codec, default is libx264
 *
 * @param {String} input
 * @param {String} output
 * @return
 * @api public
 */

function videofy(input, output, opts, fn) {
  if (!input) throw new Error('input filename required');
  if (!output) throw new Error('output filename required');
  var FORMAT = '-%05d';

  // options
  if ('function' == typeof opts) {
    fn = opts;
    opts = {};
  } else {
    opts = opts || {};
  }
  opts.rate = opts.rate || 10;
  opts.codec = opts.codec || 'libx264';

  // tmpfile(s)
  var id = uid(10);
  var dir = 'tmp/' + id;
  var tmp = dir + '/tmp' + FORMAT + '.jpg';

  function gc(err) {
    debug('remove %s', dir);
    exec('rm -fr ' + dir);
    fn(err);
  }

  debug('mkdir -p %s', dir);
  mkdirp(dir, function(error) {
    if (error) return fn(error);

    // convert gif to tmp jpg
    var cmd = ['convert', input, tmp];
    cmd = escape(cmd);
    debug('exec %s', cmd);

    exec(cmd, function(err) {
      if (err) return gc(err);

      // convert jpg collection to video
      var cmd = ['ffmpeg'];
      cmd.push('-f', 'image2');
      cmd.push('-r', String(opts.rate));
      cmd.push('-i', tmp);
      cmd.push('-c:v', String(opts.codec));
      cmd.push(output);
      cmd = escape(cmd);
      debug("exec %s", cmd);
      exec(cmd, gc);
    });
  });
}
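Hypothetical usage of the videofy function above (file names are placeholders; it assumes ImageMagick's convert and ffmpeg are on the PATH):
var videofy = require('videofy');

videofy('input.gif', 'output.mp4', { rate: 10, codec: 'libx264' }, function(err) {
    if (err) return console.error(err);
    console.log('wrote output.mp4');
});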
Using require("child_process") you can shell out to ffmpeg, or there are probably npm modules to help with this. ffmpeg will let you first take a list of jpegs and convert them to a video, and second, add a list of jpegs (or just one) to the beginning or end of a video.
Consider e.g. the following scenario: we give some entry point URL (something like https://our.server/customer-name/entry-point.js) to our customer, so that they're able to include our product on their page by simply writing
<script language="Javascript" src="https://our.server/customer-name/entry-point.js"/>
in the place where they want to put our product (yes, I know, this is an ugly solution, but it is not something I can change).
So here we face the problem: our entry-point.js should somehow know from where (https://our.server/customer-name/) it should load the other files. So it seems that the answer is to generate entry-point.js dynamically so that it contains e.g.
var ourcompany_ourproduct_basepath = "https://our.server/customer-name/";
The obvious way to do this is to construct an entry-point.js manually, something like this:
res.write("var ourprefix_basepath = \"" + basepath.escape() + "\";");
res.write("function ourprefix_entryPoint() { /*do something*/ }");
res.write("ourprefix_entryPoint();");
As you can see, this is just too ugly.
Is there any template engine that will allow e.g. for the following:
var basepath = "https://our.server/customer-name/";
var export = {
ourprefix_basepath: basepath.escape(),
ourprefix_entrypoint: function() { /* do something */ }
};
templateEngine.render(export);
or
view.vw:
ourprefix_basepath = rewrite("{#basepath}");
function ourprefix_entrypoint() { /* do something */ }
ourprefix_entrypoint();
App.js:
templateEngine.render("view.vw", { basepath: "https://our.server/customer-name/" });
or something like this (you've got the idea), which will write the following to the response stream:
var ourprefix_basepath = "https://our.server/customer-name/";
function ourprefix_entrypoint() { /* do something */ };
ourprefix_entrypoint();
?
It seems that Node.js supports reflection, though I can't find it explicitly stated anywhere.
So, by exploiting the fact that JSON is a subset of JS, the following cleaner code is possible without using a template engine:
var entrypoint = function(data) {
    /* do something with data.basepath and data.otherparam here... */
}

exports.processRequest = function(request, response) {
    var data = {
        basepath: computeBasepath(request),
        otherparam: "somevalue"
    };
    response.send("(" + entrypoint.toString() + ")(" + JSON.stringify(data) + ")");
}
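For illustration, given a hypothetical basepath, the response body produced by processRequest above would look like this (whitespace aside):
(function(data) {
    /* do something with data.basepath and data.otherparam here... */
})({"basepath":"https://our.server/customer-name/","otherparam":"somevalue"})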