I have created a PDF in the browser with JavaScript and sent it via POST to the server using this code:
var blob = pdf.output('blob')
var xhr = new XMLHttpRequest();
xhr.open('post','/upload', true);
xhr.setRequestHeader("Content-Type", "application/pdf");
xhr.send(blob);
I would like to save it as a PDF on the server, which runs Node with Express. I have come up with the following code using the express and body-parser packages:
const express = require('express');
const bodyParser = require('body-parser');
const app = express();

app.use(bodyParser.urlencoded({ limit: '1gb', extended: false }));
app.use(bodyParser.raw({ limit: '1gb', type: 'application/pdf' }));

app.post('/upload', function(req, res) {
  console.log(req.body);
});
req.body is a Buffer (Uint8Array[653120]).
I need help converting it back to a PDF before saving it on the server. Any help would be appreciated. Thanks.
A Buffer is a literal binary representation. Just write it to a file directly, without .toString(), and it should be the file you want.
e.g. try fs.writeFileSync('some.pdf', req.body)
I don't actually recommend writeFileSync, though; use writeFile instead, which is async and needs a callback, but won't block other HTTP requests from being accepted.
A Buffer is just a sequence of bytes without any encoding. If you expect the body to look like XML when you log it out, try .toString('utf8') on it. hex/utf8/base64 are just representations of binary; they're like functions to pack or unpack data. In this case you want the sequence of bytes in your buffer to exist on disk as they are, so messing with the encoding is undesirable.
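Putting the pieces together, a minimal sketch of the whole route (the output name uploaded.pdf is just a placeholder):

const express = require('express');
const bodyParser = require('body-parser');
const fs = require('fs');
const app = express();

app.use(bodyParser.raw({ limit: '1gb', type: 'application/pdf' }));

app.post('/upload', function(req, res) {
  // req.body already holds the raw PDF bytes; write them out unchanged.
  fs.writeFile('uploaded.pdf', req.body, function(err) {
    if (err) return res.sendStatus(500);
    res.sendStatus(200);
  });
});

app.listen(3000);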
Related
I've set up a small Node.js back-end app, built with Express and the fast-csv module on top of it. The desired outcome is to be able to download a CSV file to the client side without storing it anywhere on the server, since the data is generated depending on user criteria.
So far I've gotten somewhere with it. I'm using streams, since that CSV file could be pretty large depending on the user's selection. I'm pretty sure something is missing in the code below:
const fs = require('fs');
const fastCsv = require('fast-csv');
.....
(inside api request)
.....
router.get('/', async (req, res) => {
  const gatheredData ...
  const filename = 'sometest.csv'

  res.writeHead(200, {
    'Content-Type': 'text/csv',
    'Content-Disposition': 'attachment; filename=' + filename
  })

  const csvDataStream = fastCsv.write(data, { headers: true }).pipe(res)
})
The above code 'works' in a way: it does deliver a response, but not an actual file download; the contents of the CSV come back as the response body, which I can view in the preview tab. To sum up, I'm trying to stream the data into a CSV and push it to the client as a file download, not store it on the server. Any tips or pointers are very much appreciated.
Here's what worked for me after creating a CSV file on the server using the fast-csv package. You need to specify the full, absolute directory path where the output CSV file was created:
const csv = require("fast-csv");
const csvDir = "abs/path/to/csv/dir";
const filename = "my-data.csv";
const csvOutput = `${csvDir}/${filename}`;
console.log(`csvOutput: ${csvOutput}`); // full path
/*
CREATE YOUR csvOutput FILE USING 'fast-csv' HERE
*/
res.type("text/csv");
res.header("Content-Disposition", `attachment; filename="${filename}"`);
res.header("Content-Type", "text/csv");
res.sendFile(filename, { root: csvDir });
You need to make sure to set the response Content-Type header to "text/csv", and try enclosing the filename=... part in double quotes, as in the example above.
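That said, if the goal is to avoid touching the disk entirely, you can also pipe fast-csv straight into the response, much as the question's own code almost does. A minimal sketch, assuming gatheredData is an array of plain row objects (fetchRows is a hypothetical data source):

const fastCsv = require('fast-csv');

router.get('/', async (req, res) => {
  const gatheredData = await fetchRows(req.query); // hypothetical data source
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="sometest.csv"');
  // fastCsv.write() returns a readable stream, so nothing is written to disk.
  fastCsv.write(gatheredData, { headers: true }).pipe(res);
});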
For example, this is my server with a simple API:
var express = require('express');
var bodyParser = require('body-parser');
var rzServer = express();

rzServer.use(bodyParser.urlencoded({ extended: true }));
rzServer.use(bodyParser.json());

rzServer.get('/url', function(req, res) {
  console.log(req.query.data); // String
  console.log(JSON.parse(req.query.data)); // Object
});
req.query.data is interpreted as a string, but it's a JSON object. Is it possible to parse the query string with the body-parser package?
Thanks.
body-parser is middleware for parsing the request body (as its name suggests). If you want to parse the query string, you need another middleware for that.
Another thing: GET requests normally don't take any JSON parameters (they have no body). If you need to send true JSON, perhaps you're not using the right HTTP method. Try a POST request, or create a proper query string (http://expressjs.com/fr/api.html#req.query).
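If you must keep the GET request, a tiny middleware of your own can do the job. A sketch, assuming the client sends a JSON-encoded data parameter:

rzServer.use(function(req, res, next) {
  if (typeof req.query.data === 'string') {
    try {
      // Replace the raw string with the parsed object.
      req.query.data = JSON.parse(req.query.data);
    } catch (e) {
      return next(e); // malformed JSON in the query string
    }
  }
  next();
});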
I am using Node with Express to process POST requests of Heroku logging data whose body is in the application/logplex-1 format (apparently syslog-formatted).
In particular, I am using the body-parser module as middleware to parse the POST body.
It works OK to specify app.use(bodyParser.text({ type: 'application/logplex-1' })) to force body-parser to parse the body as text, but the text is just a big block of space-separated information without much structure other than that. Therefore I need to parse the body data further to find and extract what I want.
This is OK, but I'm wondering if there is perhaps a better way of parsing the logplex-1 body more directly into something more structured and easier to work with, like JSON. I'm not familiar with logplex-1 or the syslog format, or whether it does indeed have any more useful structure/metadata than is apparent from the text block I'm currently getting.
Any ideas?
I have no experience with logplex or Heroku, but this seems to be working:
var syslogParser = require('glossy').Parse;
var express = require('express');
var app = express();
var server = app.listen(3012);
// Express allows arrays-of-middleware to act as a "single" middleware.
var logplexMiddleware = [
  // First, read the message body into `req.body`, making sure it only
  // accepts logplex "documents".
  require('body-parser').text({ type: 'application/logplex-1' }),
  // Next, split `req.body` into separate lines and parse each one using
  // the `glossy` syslog parser.
  function(req, res, next) {
    req.body = (req.body || '').split(/\r*\n/).filter(function(line) {
      // Make sure we only parse lines that aren't empty.
      return line.length !== 0;
    }).map(function(line) {
      // glossy doesn't like octet counts to be prepended to the log lines,
      // so remove those.
      return syslogParser.parse(line.replace(/^\d+\s+/, ''));
    });
    next();
  }
];
// Example endpoint:
app.post('/', logplexMiddleware, function(req, res) {
  console.log(req.body);
  return res.sendStatus(200);
});
It uses glossy to parse the syslog messages into JavaScript objects.
If the amount of data being posted is considerable (>hundreds of K's), it might be better to implement a streaming solution as the code above will first read the entire message body into memory.
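For instance, a hypothetical streaming variant could parse each line as it arrives instead of buffering the whole body, using Node's readline module over the request stream (same glossy parser as above; the parsed objects are still collected in memory here, but you could process each one as it is emitted):

var readline = require('readline');

app.post('/stream', function(req, res) {
  var rl = readline.createInterface({ input: req });
  var messages = [];
  rl.on('line', function(line) {
    if (line.length === 0) return;
    // Strip the prepended octet count, as before, then parse.
    messages.push(syslogParser.parse(line.replace(/^\d+\s+/, '')));
  });
  rl.on('close', function() {
    console.log(messages);
    res.sendStatus(200);
  });
});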
I googled a lot to find out how to secure file uploads in Express, and in the end I developed the following code to do it:
app.use(express.json());
app.use(express.urlencoded());

app.post('/', express.bodyParser({
  keepExtensions: true,
  uploadDir: __dirname + '/faxFiles',
  limit: '20mb'
}), function(req, res) {
  checkFile(req.files.faxFile);
});
As you can see, I can limit the file size and set uploadDir in bodyParser. Now I need to allow the user to upload images and PDFs only. The way I did it is the checkFile function, which contains the following code:
var fs = require('fs');

var checkFile = function(faxFile) {
  // The checks must be combined with && rather than ||; with || the
  // condition is always true and every upload would be deleted.
  if (faxFile.type != "image/jpeg" && faxFile.type != "application/pdf" && faxFile.type != "image/gif") {
    fs.unlink(faxFile.path, function(err) {
    });
  }
}
But I don't think this is the best way. Is there an alternative way to do it, such as setting the allowed file extensions in the bodyParser constructor?
You can use mmmagic for strict file-type checking. It is an async libmagic binding for Node.js that detects content types by inspecting the data itself.
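For instance, a minimal sketch of a checkFile based on content inspection rather than the client-supplied type (this assumes the upload has already been written to faxFile.path by bodyParser):

var fs = require('fs');
var mmm = require('mmmagic');
var magic = new mmm.Magic(mmm.MAGIC_MIME_TYPE);

var checkFile = function(faxFile) {
  // Detect the type from the file's actual bytes instead of trusting
  // the MIME type the client declared.
  magic.detectFile(faxFile.path, function(err, mimeType) {
    var allowed = ['image/jpeg', 'image/gif', 'application/pdf'];
    if (err || allowed.indexOf(mimeType) === -1) {
      fs.unlink(faxFile.path, function(err) {});
    }
  });
};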
Express uses formidable (https://github.com/felixge/node-formidable) for parsing form data, including file uploads.
I don't see an option in formidable to restrict file types, so I suspect Express likely wouldn't have one either.
I created a little gist to show how to check the mime type using mmmagic while streaming the file:
https://gist.github.com/chmanie/8520572
This is more likely to work in a streaming environment like multiparty or busboy.
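As a rough illustration, a hypothetical busboy route that discards disallowed uploads while streaming (note that mimetype here is still the client-declared value; for real content sniffing, combine this with mmmagic as in the gist; this uses the busboy 0.x constructor-style API):

var Busboy = require('busboy');
var fs = require('fs');
var path = require('path');

app.post('/upload', function(req, res) {
  var busboy = new Busboy({
    headers: req.headers,
    limits: { fileSize: 20 * 1024 * 1024 }
  });
  busboy.on('file', function(field, file, filename, encoding, mimetype) {
    if (mimetype !== 'image/jpeg' && mimetype !== 'application/pdf') {
      file.resume(); // drain and discard the stream; nothing touches disk
      return;
    }
    file.pipe(fs.createWriteStream(path.join(__dirname, 'faxFiles', filename)));
  });
  busboy.on('finish', function() {
    res.sendStatus(200);
  });
  req.pipe(busboy);
});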
I am trying to build a server that can accept gzipped POST data with Express. I think I could just write my own middleware to pipe the request stream through a zlib.createGunzip() stream. The question is how to ensure that, afterwards, the express.bodyParser() middleware is still able to parse my gunzipped POST data.
I tried replacing the original request stream methods with those of the zlib stream, but that just made the bodyParser return a "Bad Request" error:
var express = require('express');
var zlib = require('zlib');
var app = express();

function gUnzip(req, res, next) {
  var newReq;
  if (req.headers['content-encoding'] === 'gzip') {
    console.log("received gzipped body");
    newReq = req.pipe(zlib.createGunzip());
    Object.getOwnPropertyNames(newReq).forEach(function (p) {
      req[p] = newReq[p];
    });
  }
  next();
}

app.use(gUnzip);
app.use(express.bodyParser());
app.listen(8080);
Is there a way to make this work without rewriting the bodyParser() middleware within my own middleware?
EDIT:
This is the same question: Unzip POST body with node + express. But in that answer he just does in his own middleware what express.bodyParser() should do, which is what I want to avoid. I am looking for a way to simply unzip the request data from the stream and then pass it to bodyParser(), which itself expects a stream, as can be seen at http://www.senchalabs.org/connect/json.html.
Compressed request bodies are generally not used because you can't easily negotiate content encodings between the client and server (there's another Stack Overflow question about that, I believe). Most servers don't support compressed request bodies, and the only time you really need them is for APIs where the client sends large bodies.
body-parser, and specifically raw-body, does not support them because the use case is so minimal, though I've thought about adding it. For now, you'll have to create your own body parser. Fortunately, that's easy since you can just fork body-parser and leverage raw-body. The main code you'd add, around https://github.com/expressjs/body-parser/blob/master/index.js#L80:
var zlib = require('zlib')
var getBody = require('raw-body') // raw-body exports the getBody function body-parser uses

var stream
switch (req.headers['content-encoding'] || 'identity') {
  case 'gzip':
    stream = req.pipe(zlib.createGunzip())
    break
  case 'deflate':
    stream = req.pipe(zlib.createInflate())
    break
  case 'identity':
    break
  default:
    var err = new Error('encoding not supported')
    err.status = 415
    next(err)
    return
}

getBody(stream || req, {
  limit: '1mb',
  // only check content-length if body is not encoded
  length: !stream && req.headers['content-length'],
  encoding: 'utf8'
}, function (err, buf) {
  // buf is the decompressed body as a utf8 string
})
Have you tried using the built-in compress middleware? It's documented in the Express API reference:
app.use(express.compress());
Maybe you can find something useful here instead: Unzip POST body with node + express