How to serve a file that is not saved in a folder but stored in a PostgreSQL database, and render it in HTML with Express.js? - node.js

We first send the file to be saved as follows:
router.post('/savefile', multer().single('filesource'), (q, s) => {
  db.query("insert into filesource (filerefid, source) values ($1, $2) returning filerefid", [q.body.filerefid, q.file])
    .then(r => s.send(r.rows[0]))
})
And then tried to display it in the HTML like this:
<object type="application/pdf" data="http://localhost:3000/getsource/1"></object>
Which is requested by:
router.get('/getsource/:i', (q, s) => {
  db.query('select source from filesource where filerefid = $1;', [q.params.i])
    .then(r => s.send(Buffer.from(r.rows[0].source.buffer)))
})
However, the files cannot be displayed or downloaded correctly like that. What is the best way to do this without saving the file to any directory?

I suppose you are missing the Content-Type header in the response.
res.set('Content-Type', 'application/pdf');
Also consider setting the Content-Disposition header so the browser gets a correct file name.
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Disposition
Content-Type: application/pdf
Content-Disposition: attachment; filename="file.pdf"
Update
The file property provided by multer is a JavaScript object that contains a buffer property with the binary data of the file. In the HTTP GET handler this binary data should be returned, decorated with the required HTTP headers mentioned above.
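Applied to the GET route from the question, that looks roughly like this (a sketch, assuming source is a bytea column holding the raw file bytes as in the corrected insert below; the file name is just a placeholder):
router.get('/getsource/:i', (q, s) => {
  db.query('select source from filesource where filerefid = $1;', [q.params.i])
    .then(r => {
      // tell the browser what it is receiving and what to call it
      s.set('Content-Type', 'application/pdf')
      s.set('Content-Disposition', 'attachment; filename="file.pdf"')
      s.send(r.rows[0].source) // node-postgres returns a bytea column as a Buffer
    })
})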

As mentioned in Oleg Flores' answer about the correct headers, and as noted in his comment, we should use the buffer from the multer file object. Since Multer does not save any file, the values of the insert query should be
[ q.body.filerefid, q.file.buffer ]
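In other words, the POST route becomes something like this (a sketch of the question's route with only that value changed):
router.post('/savefile', multer().single('filesource'), (q, s) => {
  // multer keeps the upload in memory; q.file.buffer holds the raw bytes
  db.query("insert into filesource (filerefid, source) values ($1, $2) returning filerefid",
    [q.body.filerefid, q.file.buffer])
    .then(r => s.send(r.rows[0]))
})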

Related

"Error: MultipartParser.end(): stream ended unexpectedly" error when uploading a file

I'm uploading a file from a Buffer using form-data. On the server side I'm using formidable to parse the file data. I keep getting errors like this on the server:
Error: MultipartParser.end(): stream ended unexpectedly: state = START
or
Error: MultipartParser.end(): stream ended unexpectedly: state = PART_DATA
I'm not sure if this is a problem with form-data or formidable. I found a lot of solutions (mostly involving not setting the Content-Type header manually). However, I couldn't find one that resolved the issue for me. I ended up figuring something out, so posting in order to answer.
I encountered this issue while developing a Strapi upload provider. Strapi provides information about a file that needs to be uploaded to a service. The file contents are provided as a Buffer (for some reason). Here's what my code looked like when I was getting the error (modified slightly):
const FormData = require('form-data')
const { Readable } = require('stream')

const abortController = new AbortController()
const form = new FormData()
form.append('file', Readable.from(file.buffer))
form.append('name', file.name)
form.append('hash', file.hash)
form.append('mime', file.mime)
form.on('error', () => abortController.abort())
return fetch(url, {
  method: 'post',
  body: form,
  signal: abortController.signal,
})
Again, I'm not sure if this is a problem with form-data or formidable, but if I provide a filename and knownLength to form-data, the issue goes away. This is what my final code looks like (modified slightly):
const fileStream = Readable.from(file.buffer)
const fileSize = Buffer.byteLength(file.buffer)
const abortController = new AbortController()
const form = new FormData()
form.append('file', fileStream, { filename: file.name, knownLength: fileSize })
form.append('name', file.name)
form.append('hash', file.hash)
form.append('mime', file.mime)
form.on('error', () => abortController.abort())
return fetch(url, {
  method: 'post',
  body: form,
  signal: abortController.signal,
})
I've tried logging the form.on('error') result and I get nothing (it's not aborting).
I've tried just setting filename and I get the same error.
I've tried just setting knownLength. The file uploads but it's empty (at least, formidable thinks it is). It must need the filename in order to parse the file correctly?
This is likely an issue with form-data not reading the input stream or not writing the output stream properly (looking at the raw form data on the server, I noticed the file data was truncated), or with formidable not reading the file data properly. Something about setting the filename and knownLength bypasses the issue.
UPDATE: This may have been partially fixed in a newer version of form-data. I updated the package and no longer need to set the knownLength. I still need to set filename though. Without it, the server thinks the file is empty.
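For reference, with the newer form-data version the file part of the code above reduces to roughly this (a sketch based on the update; file still comes from the Strapi provider context):
// knownLength is no longer needed, but filename still is,
// otherwise the server sees an empty file
form.append('file', Readable.from(file.buffer), { filename: file.name })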

Return JSON response in ISO-8859-1 with NodeJS/Express

I have built an API with Node and Express which returns some JSON. The JSON data is to be read by a web application. Sadly this application only accepts ISO-8859-1 encoded JSON which has proven to be a bit difficult.
I can't manage to return the JSON with the right encoding even though I've tried the methods in the Express documentation and also all tips from googling the issue.
The Express documentation says to use "res.set()" or "res.type()", but none of them works for me. The commented lines are all the variants that I've tried (using Mongoose):
MyModel.find()
  .sort([['name', 'ascending']])
  .exec((err, result) => {
    if (err) { return next(err) }
    // res.set('Content-Type', 'application/json; charset=iso-8859-1')
    // res.set('Content-Type', 'application/json; charset=ansi')
    // res.set('Content-Type', 'application/json; charset=windows-1252')
    // res.type('application/json; charset=iso-8859-1')
    // res.type('application/json; charset=ansi')
    // res.type('application/json; charset=windows-1252')
    // res.send(result)
    res.json(result)
  })
None of these has any effect on the response; it always comes back as "Content-Type: application/json; charset=utf-8".
Since JSON should(?) be encoded in UTF-8, is it even possible to use any other encoding with Express?
If you look at the lib/response.js file in the Express source code (in your node_modules folder or at https://github.com/expressjs/express/blob/master/lib/response.js) you'll see that res.json takes your result, generates the corresponding JSON representation in a JavaScript String, and then passes that string to res.send.
The cause of your problem is that when res.send (in that same source file) is given a String argument, it encodes the string as UTF8 and also forces the charset of the response to utf-8.
You can get around this by not using res.json. Instead build the encoded response yourself. First use your existing code to set up the Content-Type header:
res.set('Content-Type', 'application/json; charset=iso-8859-1')
After that, manually generate the JSON string:
jsonString = JSON.stringify(result);
then encode that string as ISO-8859-1 into a Buffer:
jsonBuffer = Buffer.from(jsonString, 'latin1');
Finally, pass that buffer to res.send:
res.send(jsonBuffer)
Because res.send is no longer being called with a String argument, it should skip the step where it forces charset=utf-8 and should send the response with the charset value that you specified.
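Putting those steps together, the handler body looks roughly like this (a sketch reusing the Mongoose query from the question):
MyModel.find()
  .sort([['name', 'ascending']])
  .exec((err, result) => {
    if (err) { return next(err) }
    // declare the encoding we are actually sending
    res.set('Content-Type', 'application/json; charset=iso-8859-1')
    // stringify ourselves and encode as latin1 so res.send receives a Buffer,
    // which stops Express from forcing charset=utf-8
    res.send(Buffer.from(JSON.stringify(result), 'latin1'))
  })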

Node buffer doesn't properly download file?

I'm successfully sending a get request that generates a pdf on the server, which I now want to send back to client and download it on their browser. The npm pdf generating library I'm using is called html-pdf and has the following options:
pdf.create(html).toFile([filepath, ]function(err, res){
  console.log(res.filename);
});
pdf.create('<h1>Hi</h1>').toBuffer(function(err, buffer){
  console.log('This is a buffer:', Buffer.isBuffer(buffer));
});
When I use the toFile option, the file gets correctly generated, however, when I use the toBuffer option and send that back to the user, the resulting pdf is blank.
I send the buffer to the user from my ajax handler like this:
module.exports = function(req, res) {
  pdf.create(html).toBuffer(function(err, buffer){
    res.setHeader('Content-Disposition', 'attachment; filename=panda.pdf');
    res.setHeader('Content-Type', 'application/pdf');
    res.send(buffer)
  });
};
which gets received on the client here:
$.get('generatePdf', function(data, status) {
  const url = window.URL.createObjectURL(new Blob([data]));
  const link = document.createElement('a');
  link.href = url;
  link.setAttribute('download', 'file.pdf');
  document.body.appendChild(link);
  link.click();
})
For some reason though the pdf that is downloaded is blank. Does anyone know what I might be doing wrong?
My downloaded file is corrupt according to this online pdf validator with the following errors:
Result: Document does not conform to PDF/A.
Details: Validating file "file (8).pdf" for conformance level pdf1.4. The 'xref' keyword was not found or the xref table is malformed. The file trailer dictionary is missing or invalid. The "Length" key of the stream object is wrong. Error in Flate stream: data error. The document does not conform to the requested standard. The file format (header, trailer, objects, xref, streams) is corrupted. The document does not conform to the PDF 1.4 standard.
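One thing worth checking (my assumption, not something confirmed in this thread): jQuery's $.get treats the response as text by default, which mangles the binary PDF bytes before they ever reach the Blob. Requesting the response as a Blob avoids that, for example with fetch:
fetch('generatePdf')
  .then(function(response) { return response.blob(); }) // keep the bytes binary
  .then(function(blob) {
    const url = window.URL.createObjectURL(blob);
    const link = document.createElement('a');
    link.href = url;
    link.setAttribute('download', 'file.pdf');
    document.body.appendChild(link);
    link.click();
  });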

Why don't I get the exact same file size through node.js?

I have some simple upload code in node.js.
var http = require('http')
var fs = require('fs')
var server = http.createServer(function(req, res){
  if(req.url == '/upload') {
    var a = fs.createWriteStream('a.jpg', { defaultEncoding: 'binary' })
    req.on('data', function(chunk){
      a.write(chunk)
    })
    .on('end', function(){
      a.end()
      res.end('okay')
    })
  }
  else {
    fs.createReadStream('./index.html').pipe(res);
    // just show <form>
  }
})
server.listen(5000)
When I upload an image, I don't get the exact same file back.
The files are always broken.
When I try to do this using formidable, I get a good file.
So I studied formidable, but I cannot understand how it catches and saves the data.
I could see that formidable uses a parser to work out something about the chunks from the request, but I did not fully get it.
(It is definitely my brain issue :( ).
Anyway, what is the difference between my code and formidable?
What am I missing?
Is it wrong to just append all the chunks from the HTTP request and save them with
fs.createWriteStream or fs.writeFile?
What concepts am I missing?
First, req is a Readable stream. You can simply do:
req.pipe(fs.createWriteStream('a.jpg'))
for the upload part. This copies all the byte data from the request stream to the file.
This will work when you send raw file data as the request body:
curl --data-binary @"/home/user/Desktop/a.jpg" http://localhost:8080/upload
This sends the request body as the raw image binary data, which gets streamed straight to a file on the server.
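As a complete minimal handler for that raw-body case, the server side reduces to roughly this (a sketch mirroring the question's server, using port 8080 to match the curl examples):
var http = require('http')
var fs = require('fs')

http.createServer(function(req, res){
  if (req.url == '/upload') {
    // stream the raw request body straight into the file
    req.pipe(fs.createWriteStream('a.jpg'))
    req.on('end', function(){ res.end('okay') })
  }
}).listen(8080)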
But, there is another request format called multipart/form-data. This is what web browsers use with <form> to upload files.
curl --form "image=@/home/user1/Desktop/a.jpg" http://localhost:8080/upload
Here the request body contains multiple "parts", one for each file attachment or form field, separated by special "boundary" characters:
--------------------------e3f25f5319cd6624
Content-Disposition: form-data; name="image"; filename="a.jpg"
Content-Type: application/octet-stream
JPG IHDRH-ÑtEXtSoftwareAdobe.....raw file data
--------------------------e3f25f5319cd6624
Hence you will need much more complicated code to extract the file data from each part. npm modules like busboy and formidable do exactly that.
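For example, handling the same upload with busboy looks roughly like this (a sketch, assuming busboy 1.x and the "image" field name from the curl example above):
var http = require('http')
var fs = require('fs')
var busboy = require('busboy')

http.createServer(function(req, res){
  if (req.url == '/upload' && req.method === 'POST') {
    var bb = busboy({ headers: req.headers })
    bb.on('file', function(name, file, info){
      // info.filename comes from the part's Content-Disposition header
      file.pipe(fs.createWriteStream('a.jpg'))
    })
    bb.on('close', function(){
      res.end('okay')
    })
    req.pipe(bb)
  }
}).listen(8080)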

Get MIME type of Node Request.js response in Proxy - Display if image

I’m writing some proxy server code which intercepts a request (originated by a user clicking on a link in a browser window) and forwards the request to a third party fileserver. My code then gets the response and forwards it back to the browser. Based on the mime type of the file, I would like to handle the file server's response in one of two ways:
If the file is an image, I want to send the user to a new page that
displays the image, or
For all other file types, I simply want the browser to handle receiving it (typically a download).
My node stack includes Express+bodyParser, Request.js, EJS, and Passport. Here's the basic proxy code along with some pseudo code that needs a lot of help. (Mea culpa!)
app.get('/file', ensureLoggedIn('/login'), function(req,res) {
  var filePath = 'https://www.fileserver.com/file'+req.query.fileID,
      companyID = etc…,
      companyPW = etc…,
      fileServerResponse = request.get(filePath).auth(companyID,companyPW,false);
  if ( fileServerResponse.get('Content-type') == 'image/png') // I will also add other image types
  // Line above yields TypeError: Object #<Request> has no method 'get'
  // Is it because Express and Request.js aren't using compatible response object structures?
  {
    // render the image using an EJS template and insert image using base64-encoding
    res.render( 'imageTemplate',
      { imageData: new Buffer(fileServerResponse.body).toString('base64') }
    );
    // During render, EJS will insert data in the imageTemplate HTML using something like:
    // <img src='data:image/png;base64, <%= imageData %>' />
  }
  else // file is not an image, so let browser deal with receiving the data
  {
    fileServerResponse.pipe(res); // forward entire response transparently
    // line above works perfectly and would be fine if I only wanted to provide downloads.
  }
})
I have no control over the file server and the files won't necessarily have a file suffix so that's why I need to get their MIME type. If there's a better way to do this proxy task (say by temporarily storing the file server's response as a file and inspecting it) I'm all ears. Also, I have flexibility to add more modules or middleware if that helps. Thanks!
You need to pass a callback to the request function as per its interface. It is asynchronous and does not return the fileServerResponse as a return value.
request.get({
  uri: filePath,
  'auth': {
    'user': companyId,
    'pass': companyPW,
    'sendImmediately': false
  }
}, function (error, fileServerResponse, body) {
  // note that fileServerResponse uses the node core http.IncomingMessage API
  // so the content type is in fileServerResponse.headers['content-type']
});
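Inside that callback you can then branch on the reported MIME type, roughly like this (a sketch following the question's two cases; encoding: null makes request hand you the body as a Buffer so the image bytes survive base64 encoding):
request.get({
  uri: filePath,
  auth: { user: companyID, pass: companyPW, sendImmediately: false },
  encoding: null // body arrives as a Buffer instead of a string
}, function (error, fileServerResponse, body) {
  if (error) { return res.sendStatus(502); }
  var mimeType = fileServerResponse.headers['content-type'] || '';
  if (mimeType.indexOf('image/') === 0) {
    // show the image on a page via the EJS template from the question
    res.render('imageTemplate', { imageData: body.toString('base64') });
  } else {
    // anything else: pass the bytes through and let the browser handle them
    res.set('Content-Type', mimeType);
    res.send(body);
  }
});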
You can use the mmmagic module. It is an async libmagic binding for node.js that detects content types by inspecting the data itself.
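If the file server's Content-Type header cannot be trusted, sniffing the buffer itself would look roughly like this (a sketch, assuming the response body is available as a Buffer, e.g. via encoding: null as above):
var mmm = require('mmmagic');
var magic = new mmm.Magic(mmm.MAGIC_MIME_TYPE);
magic.detect(body, function(err, mimeType) {
  if (err) { return res.sendStatus(500); }
  // mimeType is e.g. 'image/png', detected from the bytes themselves
});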
