I googled a lot to find out how to secure file uploading in Express, and in the end I came up with the following code.
app.use(express.json());
app.use(express.urlencoded());

app.post('/', express.bodyParser({
  keepExtensions: true,
  uploadDir: __dirname + '/faxFiles',
  limit: '20mb'
}), function (req, res) {
  checkFile(req.files.faxFile);
});
As you can see, I can limit the file size and set uploadDir in bodyParser. Now I need to allow the user to upload images and PDFs only. The way I did it is the checkFile function, which contains the following code.
var fs = require('fs');

var checkFile = function (faxFile) {
  // note: these checks must be combined with && (not ||);
  // with || the condition is always true and every upload gets deleted
  if (faxFile.type != "image/jpeg" && faxFile.type != "application/pdf" && faxFile.type != "image/gif") {
    fs.unlink(faxFile.path, function (err) {
    });
  }
}
But I don't think this is the best way. Is there an alternative, such as setting the allowed file extensions in the bodyParser constructor?
You can use mmmagic for strict checking of file types. It is an async libmagic binding for node.js that detects content types by inspecting the data itself.
Express uses formidable (https://github.com/felixge/node-formidable) for parsing form data, including file uploads.
I don't see an option in formidable to restrict file types, so I suspect Express doesn't have one either.
I created a little gist to show how to check the mime type using mmmagic while streaming the file:
https://gist.github.com/chmanie/8520572
This works best in a streaming environment like multiparty or busboy.
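If you'd rather avoid a native dependency like mmmagic, here is a minimal sketch of my own (not from any library) that checks a file's leading "magic bytes" directly. The signatures below for JPEG, GIF, and PDF are the standard ones, but this only covers those three types:

```javascript
// Leading byte signatures of the allowed types.
var SIGNATURES = [
  { type: 'image/jpeg',      bytes: [0xFF, 0xD8, 0xFF] },
  { type: 'image/gif',       bytes: [0x47, 0x49, 0x46] },       // "GIF"
  { type: 'application/pdf', bytes: [0x25, 0x50, 0x44, 0x46] }  // "%PDF"
];

// Return the detected mime type for a buffer of leading file bytes, or null.
function sniffType(buf) {
  for (var i = 0; i < SIGNATURES.length; i++) {
    var sig = SIGNATURES[i].bytes;
    if (buf.length < sig.length) continue;
    var match = true;
    for (var j = 0; j < sig.length; j++) {
      if (buf[j] !== sig[j]) { match = false; break; }
    }
    if (match) return SIGNATURES[i].type;
  }
  return null;
}

console.log(sniffType(Buffer.from('%PDF-1.4'))); // application/pdf
```

In the upload handler you would read the first few bytes of req.files.faxFile.path and unlink the file when sniffType returns null, instead of trusting the client-supplied content type.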
I'm using Unity's WebGL and I'm getting this message in the console: "You can reduce your startup time if you configure your web server to host .unityweb files using gzip compression." According to Unity's documentation, I need to add the correct response headers: https://docs.unity3d.com/Manual/webgl-deploying.html.
I found the express-static-gzip module and tried to do just that, but the warning is still there. Below is the server.
const express = require('express');
const ip = require("ip");
const expressStaticGzip = require('express-static-gzip');
const http = require('http');

const app = express();
const server = http.Server(app);

app.use('/public/Builds/Build/', expressStaticGzip('public/Builds/Build/', {
  customCompressions: [{
    encodingName: "gzip",
    fileExtension: "unityweb"
  }]
}));
// app.use(compression());
app.use(express.static('public'));

server.listen(3000, function () {
  console.log(":: http://" + ip.address() + "/ ::");
});
Any ideas?
Many thanks to @d_shiv for his help. I changed the code to the following, and the warning went away.
(You can replace gzip with br if you're using Brotli.)
const express = require('express');
const ip = require("ip");
const http = require('http');

const app = express();
const server = http.Server(app);

app.use(express.static('public', {
  setHeaders: function (res, path) {
    if (path.endsWith(".unityweb")) {
      res.set("Content-Encoding", "gzip");
    }
  }
}));

server.listen(3000, function () {
  console.log(":: http://" + ip.address() + ":3000/ ::");
});
express-static-gzip does not gzip the files on the fly before serving them. It assumes that you have both the original and the gzipped version of each file in the specified directory. Check the Examples section of the documentation here.
In this scenario, if public/Builds/Build/Builds.wasm.framework.unityweb had to be transferred with gzip compression, you'd need to create a gzipped version named public/Builds/Build/Builds.wasm.framework.unityweb.gz. The middleware automatically scans the folder for all file pairs where both the original and the gzipped version are available, and serves the gzipped version when a request comes in for the original file, if the browser supports it.
The customCompressions array should also be skipped, since gzip support is enabled by default. The middleware would be registered something like this:
app.use('/Builds/Build/', expressStaticGzip('public/Builds/Build/'));
Also note that public/ is removed from the middleware path (it should still be present in the expressStaticGzip path). This is because your assets are being loaded from https://{hostname}/Builds/Build/....
If you intend to compress the files on the fly and serve them, take a look at the compression module. That can be a very costly operation for your server though; if possible, do the gzipping at build time to create the equivalent .gz files, and continue to use express-static-gzip.
I have created a PDF in the browser with JavaScript and sent it via POST to the server using this code:
var blob = pdf.output('blob')
var xhr = new XMLHttpRequest();
xhr.open('post','/upload', true);
xhr.setRequestHeader("Content-Type", "application/pdf");
xhr.send(blob);
I would like to save it as a PDF on the server, which runs Node with Express. I came up with the following code using the express and body-parser packages:
const bodyParser = require('body-parser');

app.use(bodyParser.urlencoded({ limit: '1gb', extended: false }));
app.use(bodyParser.raw({ limit: '1gb', type: 'application/pdf' }));

app.post('/upload', function (req, res) {
  console.log(req.body);
});
req.body is a Buffer (Uint8Array[653120]).
I need help converting it back to pdf before saving in on the server. Any help would be appreciated. Thanks.
A buffer is a literal binary representation. Just write it to a file directly, without calling .toString(), and it should be the file you want.
e.g. try fs.writeFileSync('some.pdf', req.body)
I don't actually recommend using writeFileSync, though; use writeFile instead, which is async and needs a callback, but won't block other HTTP requests from being accepted.
A Buffer is just a sequence of bytes without any encoding. If you expect the body to look like XML when you log it, try .toString('utf8') on it. hex/utf8/base64 are just representations of binary; they're like functions to pack or unpack data. In this case you want the sequence of bytes in your buffer to exist on disk as-they-are, so messing with the encoding is undesirable.
I want to convert an HTML page to PDF via wkhtmltopdf. However, the HTML page I want to convert is dynamically generated using Handlebars.
So I think one solution may be to generate the HTML page via Handlebars, but to an HTML file. Then convert that file to PDF using wkhtmltopdf and, somehow, allow the user to download the PDF.
So my question is: how can I render the dynamically generated (Handlebars) HTML page to a file?
Thanks and bye ...
A simple example of creating the file:
var Handlebars = require('handlebars');
var fs = require('fs');

var source = "<p>Hello, my name is {{name}}. I am from {{hometown}}. I have " +
             "{{kids.length}} kids:</p>" +
             "<ul>{{#kids}}<li>{{name}} is {{age}}</li>{{/kids}}</ul>";

var template = Handlebars.compile(source);
var data = {
  "name": "Alan", "hometown": "Somewhere, TX",
  "kids": [{ "name": "Jimmy", "age": "12" }, { "name": "Sally", "age": "4" }]
};
var result = template(data);

fs.writeFile("test.html", result, function (err) {
  if (err) {
    return console.log(err);
  }
});
Using express-handlebars, you should use the advanced mode and create an instance of it like in this example.
The proper way would be to create a view file (like you probably already have, per your question) and use the express-handlebars instance to render it:
// init code
var exphbs = require('express-handlebars');

var hbs = exphbs.create({
  defaultLayout: 'your-layout-name',
  helpers: require("path-to-your-helpers-if-any"),
});
app.engine('.file-extension-you-use', hbs.engine);
app.set('view engine', '.file-extension-you-use');

// ...then, in the router
hbs.render('full-path-to-view', context, options).then(function (hbsTemplate) {
  // hbsTemplate contains the rendered html, do something with it...
});
HTH
The code above from Alex works perfectly. However, my confusion was that I was using 'express-handlebars', not 'handlebars'. As I now understand it, Express-Handlebars is an implementation of Handlebars for an Express application, which I'm using. I just didn't find a way to use the compile() method in Express-Handlebars, so I ended up installing Handlebars (standalone) and used it to compile my (HTML) template and save the result to disk, just as Alex explained above.
In summary:
1) I know Express-Handlebars is Handlebars for an Express app.
2) I didn't find a way to use the compile() method from express-handlebars alone, so I ended up installing Handlebars (from npm) and using it on the server to produce my HTML file (from the template) and save it to disk.
3) Of course I installed and use Express-Handlebars everywhere to serve my pages in my Express app; I installed Handlebars only to produce my HTML (on the server) with the compile() method and save the result to disk.
Hope this is understandable. Thanks again and bye ...
Usually you render a Jade page in a route like this:
app.get('/page', function(req, res, next){
res.render('page.jade');
});
But I want to serve all Jade pages (automatically rendered), just like how one would serve static HTML
app.use(express.static('public'))
Is there a way to do something similar for Jade?
"Static" means sending existing files unchanged, directly from disk to the browser. Jade can be served this way, but that is pretty unusual. Usually you want to render Jade to HTML on the server, which by definition is not "static"; it's dynamic. You do it like this:
app.get('/home', function (req, res) {
res.render('home'); // will render home.jade and send the HTML
});
If you want to serve the jade itself for rendering in the browser, just reference it directly in the url when loading it into the browser like:
$.get('/index.jade', function (jade) {
//...
});
https://github.com/runk/connect-jade-static
Usage
Assume the following structure of your project:
/views
/partials
/file.jade
Let's make the jade files from /views/partials web accessible:
var jadeStatic = require('connect-jade-static');

app = express();

app.configure(function () {
  app.use(jadeStatic({
    baseDir: path.join(__dirname, '/views/partials'),
    baseUrl: '/partials',
    jade: { pretty: true }
  }));
});
Now, if you start your web server and request /partials/file.html in the browser, you should be able to see the compiled jade template.
Connect-jade-static is good, but not the perfect solution for me.
To begin with, here are the reasons why I needed jade:
My app is a single page app, there are no HTMLs generated from templates at runtime. Yet, I am using jade to generate HTML files because:
Mixins: lots of repeated / similar code in my HTML is shortened by the use of mixins
Dropdowns: I know, lots of people use ng-repeat to fill the options in a select box. This is a waste of CPU when the list is static, e.g., list of countries. The right thing to do is have the select options filled in within the HTML or partial. But then, a long list of options makes the HTML / jade hard to read. Also, very likely, the list of countries is already available elsewhere, and it doesn’t make sense to duplicate this list.
So, I decided to generate most of my HTML partials using jade at build time. But, this became a pain during development, because of the need to re-build HTMLs when the jade file changes. Yes, I could have used connect-jade-static, but I really don’t want to generate the HTMLs at run time — they are indeed static files.
So, this is what I did:
Added a 'use' before the usual use of express.static
Within this, I check for the timestamps of jade and the corresponding html file
If the jade file is newer, regenerate the html file
Call next() after the regeneration, or immediately, if regeneration is not required.
next() will fall-through to express.static, where the generated HTML will be served
Wrap the 'use' in an "if !production" condition, and in the build scripts, generate all the HTML files required.
This way, I can also use all the goodies express.static provides (like custom headers) and still use jade to generate the files.
Some code snippets:
var express = require('express');
var fs = require('fs');
var jade = require('jade');
var urlutil = require('url');
var pathutil = require('path');
var countries = require('./countries.js');

var app = express();

var staticDir = 'static';        // really static files like .css and .js
var staticGenDir = 'static.gen'; // generated static files, like .html
var staticSrcDir = 'static.src'; // source for generated static files, .jade

if (process.argv[2] != 'prod') {
  app.use('/static', function (req, res, next) {
    var u = urlutil.parse(req.url);
    if (pathutil.extname(u.pathname) == '.html') {
      var basename = u.pathname.split('.')[0];
      var htmlFile = staticGenDir + basename + '.html';
      var jadeFile = staticSrcDir + basename + '.jade';
      var hstat = fs.existsSync(htmlFile) ? fs.statSync(htmlFile) : null;
      var jstat = fs.existsSync(jadeFile) ? fs.statSync(jadeFile) : null;
      // regenerate the html only if the jade source is newer
      if (jstat && (!hstat || (jstat.mtime.getTime() > hstat.mtime.getTime()))) {
        var out = jade.renderFile(jadeFile, { pretty: true, countries: countries });
        fs.writeFile(htmlFile, out, function () {
          next();
        });
      } else {
        next();
      }
    } else {
      next();
    }
  });
}

app.use('/static', express.static(staticDir));    // serve from the really-static dir if the file exists
app.use('/static', express.static(staticGenDir)); // if not, look in the generated static dir
In reality, I have a js file containing not just countries, but various other lists shared between node, javascript and jade.
Hope this helps someone looking for an alternative.
I am trying to build a server that can accept gzipped POST data with Express. I think I could just write my own middleware to pipe the request stream through a zlib.createGunzip() stream. The question is: how can I ensure that, afterwards, the express.bodyParser() middleware is still able to parse the gunzipped POST data?
I tried to replace the original request stream methods with those of the zlib stream, but that just made the bodyParser return a "Bad Request" error:
var express = require('express');
var zlib = require('zlib');

var app = express();

function gUnzip(req, res, next) {
  var newReq;
  if (req.headers['content-encoding'] === 'gzip') {
    console.log("received gzipped body");
    newReq = req.pipe(zlib.createGunzip());
    Object.getOwnPropertyNames(newReq).forEach(function (p) {
      req[p] = newReq[p];
    });
  }
  next();
}

app.use(gUnzip);
app.use(express.bodyParser());
app.listen(8080);
Is there a way to make this work without rewriting the bodyParser() middleware within my own middleware?
EDIT:
This is the same question: Unzip POST body with node + express. But in the answer he just does in his own middleware what the express.bodyParser() should do, which is what I want to avoid. I am looking for a way to simply unzip the request data from the stream and then pass it to the bodyParser(), which expects a stream itself, as can be seen at http://www.senchalabs.org/connect/json.html.
Compressed request bodies are generally not used because you can't easily negotiate content encodings between the client and the server (there's another Stack Overflow question about that, I believe). Most servers don't support compressed request bodies, and the only time you really need them is for APIs where the client sends large bodies.
body-parser, and specifically raw-body, does not support it because the use-case is so minimal, though I've thought about adding it. For now, you'll have to create your own body parser. Fortunately, that's easy since you can just fork body-parser and leverage raw-body. The main code you'll add, around https://github.com/expressjs/body-parser/blob/master/index.js#L80:
var zlib = require('zlib')

var stream

switch (req.headers['content-encoding'] || 'identity') {
  case 'gzip':
    stream = req.pipe(zlib.createGunzip())
    break
  case 'deflate':
    stream = req.pipe(zlib.createInflate())
    break
  case 'identity':
    break
  default:
    var err = new Error('encoding not supported')
    err.status = 415
    next(err)
    return
}

getBody(stream || req, {
  limit: '1mb',
  // only check content-length if body is not encoded
  length: !stream && req.headers['content-length'],
  encoding: 'utf8'
}, function (err, buf) {
  // buf is the decoded (and decompressed) request body
})
Have you tried using the built-in compress middleware? It's documented in the Express reference documentation:
app.use(express.compress());
Maybe you can find something useful here instead: Unzip POST body with node + express