Serve clickable download URL in NodeJS

At my endpoint in my NodeJS server, after retrieving an audio file stored as a Buffer in MongoDB, I want to represent it with a URL (much like how you do with URL.createObjectURL(blob) in the frontend on the browser). I then plan to res.render() the URL in HTML through Handlebars on the client, so that the user can click on it to download it:
<a href="{{url}}">Click me to download the file!</a>
In the NodeJs server, I have converted the MongoDB Buffer into a JavaScript ArrayBuffer through:
var buffer = Buffer.from(recordingFiles[0].blobFile);
var arrayBuffer = Uint8Array.from(buffer).buffer;
I am unsure how to proceed from here. I have seen solutions using fs or res.download(), but they don't seem applicable to my situation. Thanks in advance for any help!

Hopefully this can help. Note that this runs in the browser, not on the Node server:
var blob = new Blob([BUFFER], {type: "<your audio MIME type>"}); // Blob takes an array of parts
var link = document.createElement('a');
link.href = window.URL.createObjectURL(blob);
var fileName = reportName; // whatever name the download should get
link.download = fileName;
link.click();

Do you always need to preload the audio file onto the page?
If not, then I would advise you to add a separate endpoint to download the file on demand. The frontend link can send a get request to the endpoint and download the file only if the user clicked it.
Otherwise you'd always be downloading the buffer behind the scenes, even if the user didn't intend to download it. This is especially problematic on slow connections.
Frontend:
<a href="{{baseUrl}}/download/{{audioId}}">Click me to download the file!</a>
Backend:
const stream = require('stream');

app.get('/download/:audioId', function (request, response) {
  // Retrieve the audio id from the URL path
  const audioId = request.params.audioId;

  let fileData; // TODO: Get file buffer from mongo.
  let fileName; // TODO: Get or build the file name for the download.

  const fileContents = Buffer.from(fileData, 'base64');
  const readStream = new stream.PassThrough();
  readStream.end(fileContents);

  response.set('Content-Disposition', 'attachment; filename=' + fileName);
  response.set('Content-Type', '<your MIME type here>');

  readStream.pipe(response);
});
A list of relevant MIME types can be found here.
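For reference, here is a minimal sketch of how the TODO above might be filled in, assuming a Mongoose model (hypothetically named Recording) whose documents store the audio in a blobFile field, as in the question; the audio/mpeg MIME type and the .mp3 file name are also just assumptions:
const stream = require('stream');

app.get('/download/:audioId', async function (request, response) {
  const audioId = request.params.audioId;

  // Hypothetical Mongoose query; replace with however recordingFiles is loaded today.
  const recording = await Recording.findById(audioId);
  if (!recording) {
    return response.status(404).send('Recording not found');
  }

  // Normalize the stored MongoDB Binary/Buffer to a Node Buffer (as in the question).
  const fileContents = Buffer.from(recording.blobFile);

  const readStream = new stream.PassThrough();
  readStream.end(fileContents);

  response.set('Content-Disposition', 'attachment; filename="' + audioId + '.mp3"');
  response.set('Content-Type', 'audio/mpeg'); // assumption: use your file's real MIME type

  readStream.pipe(response);
});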

Related

Create a Blob for Video in NodeJs18 and Use it on Client Side

I am learning NodeJS and Blob, and I am trying to create a Blob from a video file in NodeJS and send it in a JSON response to my client. The data will be fetched inside getStaticProps in NextJS.
Here is what I created on the NodeJS server:
const fileBuffer = fs.readFileSync(filePath); // Path is something like example.com/video.mp4
const blob = new Blob([fileBuffer], { type: 'video/mp4' });
blobUrl = URL.createObjectURL(blob); // Return blob:nodedata:c3b1baf2-fba8-404f-8d3c-a1184a3a6db2
It returns this:
blob:nodedata:c3b1baf2-fba8-404f-8d3c-a1184a3a6db2
But how can I use it on the client? <video src="blob:nodedat..."> is not working, so I am doing something wrong for sure.
Can you help me understand what is wrong?
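Worth noting: a blob: URL minted with URL.createObjectURL in Node only refers to memory inside that Node process, so the browser cannot resolve it. A common alternative is to expose the video through an HTTP endpoint and put that URL in the <video> tag. A minimal sketch, assuming an Express server and a local videos directory (the /video/:name route is illustrative):
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();

app.get('/video/:name', function (req, res) {
  // path.basename guards against path traversal in the illustrative route parameter
  const filePath = path.join(__dirname, 'videos', path.basename(req.params.name));
  res.type('video/mp4');
  fs.createReadStream(filePath).pipe(res); // stream the file instead of building a Blob
});

app.listen(3000);
With this approach, getStaticProps only needs to pass the URL (for example /video/video.mp4) rather than the file contents, and <video src="/video/video.mp4"> will stream it.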

Node Express Fast CSV download to client

I've set up a small node.js BE app, built with Express and the fastCsv module on top of it. The desired outcome would be to be able to download a CSV file to the client side, without storing it anywhere on the server, since the data is generated depending on user criteria.
So far I've been able to get somewhere with it. I'm using streams, since that CSV file could be pretty large depending on the user selection. I'm pretty sure something is missing in the code below:
const fs = require('fs');
const fastCsv = require('fast-csv');
.....
(inside api request)
.....
router.get('/', async (req, res) => {
  const gatheredData = ... // built from the user's criteria
  const filename = 'sometest.csv'

  res.writeHead(200, {
    'Content-Type': 'text/csv',
    'Content-Disposition': 'attachment; filename=' + filename
  })

  const csvDataStream = fastCsv.write(gatheredData, {headers: true}).pipe(res)
})
The above code 'works' in a way: it does deliver a response, but not as a file download; instead I get the contents of the CSV, which I can view in the preview tab of the response. To sum up, I'm trying to stream the data into a CSV and push it to the client as a file download, without storing it on the server. Any tips or pointers are very much appreciated.
Here's what worked for me after creating a CSV file on the server using the fast-csv package. You need to specify the full, absolute directory path where the output CSV file was created:
const csv = require("fast-csv");
const csvDir = "abs/path/to/csv/dir";
const filename = "my-data.csv";
const csvOutput = `${csvDir}/${filename}`;
console.log(`csvOutput: ${csvOutput}`); // full path
/*
CREATE YOUR csvOutput FILE USING 'fast-csv' HERE
*/
res.type("text/csv");
res.header("Content-Disposition", `attachment; filename="${filename}"`);
res.header("Content-Type", "text/csv");
res.sendFile(filename, { root: csvDir });
Make sure to set the response Content-Type header to "text/csv", and try enclosing the filename=... value in double quotes, as in the example above.
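Since the original goal was to avoid writing anything to disk, here is a minimal sketch of piping fast-csv straight into the response instead; buildReport() is a hypothetical stand-in for however gatheredData is produced:
const fastCsv = require('fast-csv');

router.get('/', async (req, res) => {
  const gatheredData = await buildReport(req.query); // hypothetical: an array of row objects
  const filename = 'sometest.csv';

  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', `attachment; filename="${filename}"`);

  // fast-csv turns the rows into a readable CSV stream that can be piped into the response
  fastCsv.write(gatheredData, { headers: true }).pipe(res);
});
Note that Content-Disposition only triggers a file download when the browser itself navigates to the URL (a plain link, window.location, or form submit); if the endpoint is called via fetch/XHR, the CSV text simply appears in the network preview tab, which matches the behaviour described in the question.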

Get file name in express request stream

I'm wondering if it is possible to know the file name of an incoming binary request.
This is my situation: I have this code that handles the file upload
router.route('/:filename')
  .put(function (req, res) {
    var uuid = guid();
    var fileExtension = req.params.filename.substring(req.params.filename.lastIndexOf("."));

    if (!fs.existsSync('../files')) {
      fs.mkdirSync('../files')
    }

    var newFile = fs.createWriteStream('../files/' + uuid + fileExtension);
    req.pipe(newFile);

    req.on('end', function (end) {
      console.log("Finished")
      res.send(uuid + fileExtension)
    })
  })
As you can see, right now I need the file name specified in the URL ('/:filename'). My question is: is it possible to take that attribute from the request stream, instead of from the URL or a form key?
If you use multer middleware you can access the uploaded filename like so
var multer = require('multer')
var fs = require('fs')

var upload = multer() // default memory storage: the uploaded file ends up in req.file.buffer

router.route('/:filename')
  .put(upload.single('fileField'), function (req, res) {
    var fileName = req.file.originalname
    var uuid = guid();
    var fileExtension = fileName.substring(fileName.lastIndexOf("."));

    if (!fs.existsSync('../files')) {
      fs.mkdirSync('../files')
    }

    // multer has already consumed the request body, so write the buffered file to disk
    fs.writeFile('../files/' + uuid + fileExtension, req.file.buffer, function (err) {
      if (err) return res.status(500).send(err.message)
      console.log("Finished")
      res.send(uuid + fileExtension)
    })
  })
If you're processing the HTTP request manually, you'll need to inspect the Content-Disposition header of the request and parse the file name information out of it.
However, I'd recommend you look at some of the existing file upload middlewares; there's no point in reinventing the wheel (see the busboy sketch after the list below):
busboy
multer
formidable
multiparty
pez
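For completeness, a minimal sketch of the manual route using busboy (assuming busboy 1.x and that the client sends the file as multipart/form-data rather than a raw binary body; guid() is the helper from the question):
const busboy = require('busboy');
const fs = require('fs');
const path = require('path');

router.route('/:filename')
  .put(function (req, res) {
    // busboy 1.x: the exported value is a factory function, not a constructor
    const bb = busboy({ headers: req.headers });

    bb.on('file', function (fieldname, file, info) {
      // info.filename is the name the client supplied for this part
      const uuid = guid();
      const fileExtension = path.extname(info.filename);

      if (!fs.existsSync('../files')) {
        fs.mkdirSync('../files');
      }

      const out = fs.createWriteStream('../files/' + uuid + fileExtension);
      file.pipe(out);
      out.on('finish', function () {
        res.send(uuid + fileExtension);
      });
    });

    req.pipe(bb);
  });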

How to pass PDFKit readable stream into request's post method?

My app needs to create a PDF file and then upload it to another server. The upload happens down the line via the post method from the request NPM package. Everything works fine if I pass in an fs.createReadStream:
const fs = require('fs');
const params = {file: fs.createReadStream('test.pdf')};
api.uploadFile(params);
Since PDFKit instantiates a read stream as well, I'm trying to pass that directly into the post params like this:
const PDFDocument = require('pdfkit');
const doc = new PDFDocument();
doc.text('steam test');
doc.end();
const params = {file: doc};
api.uploadFile(params);
However, this produces an error:
TypeError: Path must be a string. Received [Function]
If I look at PDFKit source code I see (in coffeescript):
class PDFDocument extends stream.Readable
I'm new to streams and it's clear I'm not understanding the difference here. To me if they are both readable streams, they should both be able to be passed in the same way.
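One difference worth noting: fs.createReadStream returns an fs.ReadStream, which carries the underlying file path that some upload helpers use to derive the multipart file name, whereas a PDFDocument is a plain Readable without one. A minimal sketch of one workaround is to buffer the PDF before uploading; api.uploadFile is the asker's function and is assumed here to also accept a Buffer:
const PDFDocument = require('pdfkit');

function buildPdfBuffer() {
  return new Promise(function (resolve, reject) {
    const doc = new PDFDocument();
    const chunks = [];
    doc.on('data', function (chunk) { chunks.push(chunk); });
    doc.on('end', function () { resolve(Buffer.concat(chunks)); });
    doc.on('error', reject);
    doc.text('stream test');
    doc.end();
  });
}

buildPdfBuffer().then(function (pdfBuffer) {
  // Assumption: the upload API also accepts a Buffer; otherwise wrap it with
  // explicit filename/contentType options as your upload helper requires.
  api.uploadFile({ file: pdfBuffer });
});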

How to gzip http request post(client) data for node.js server

I have implemented a node.js server application which accepts POST data from the client (a long JSON string). Is there a way I can gzip the POST data on the browser end and unzip it in node.js?
I specifically want to gzip the request and not the response.
Check https://jsfiddle.net/gynz82tg/
Decompressing in nodejs works the same way once you have the base64-encoded request string.
var jsonStr = JSON.stringify({
  name: "JiangYD"
})
$('#origin').text(jsonStr);

var zip = new JSZip();
zip.file("data", jsonStr);
var content = zip.generate();
$('#compressed').text(content);

zip = new JSZip(content, {base64: true});
$('#decompressed').text(zip.file("data").asText());
<script src="https://raw.githubusercontent.com/Stuk/jszip/master/dist/jszip.js"></script>
<div id='origin'></div>
<div id='compressed'></div>
<div id='decompressed'></div>
UPDATE
Because JSZip has updated its API:
https://jsfiddle.net/cvuqr6h4/
async function go() {
  const jsonStr = JSON.stringify({
    name: "JiangYD"
  })
  $('#origin').text(jsonStr);

  let zip = new JSZip();
  zip.file("data", jsonStr);
  const content = await zip.generateAsync({type: "base64"});
  $('#compressed').text(content);

  zip = new JSZip();
  await zip.loadAsync(content, {base64: true});
  const decoded = await zip.file("data").async('string');
  $('#decompressed').text(decoded);
}
go();
You could try this: https://github.com/sapienlab/jsonpack
Example Client Code:
<script src="jsonpack.js" />
<script>
var BIG_JSON = {.....};
var packed = jsonpack.pack(BIG_JSON);
$.post('path_to_server',packed);
</script>
Example Nodejs Code:
var jsonpack = require('jsonpack/main');

app.post('/packed_data', function (req, res) {
  try {
    // req.body holds the packed string (assuming a text/raw body parser is in place)
    var data = jsonpack.unpack(req.body);
  } catch (e) {
    // not well-formed packed data
  }
})
This is sample code; of course, I don't know what framework or libraries you use, but you can see how this could be implemented.
Anyway, be careful with this, because zipping and unzipping data is always a heavy, CPU-bound task. If you have several megabytes of data, you don't want to force users on phones, tablets, etc. to do this work!
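If actual gzip (rather than a zip archive or jsonpack) is what you're after, one common approach is pako in the browser and Node's built-in zlib on the server. A minimal sketch, in which the /data route, the express.raw settings, and the use of pako are assumptions for illustration:
// Browser side (sketch, assuming the pako library is loaded on the page):
//   const body = pako.gzip(JSON.stringify(payload));   // Uint8Array of gzipped JSON
//   fetch('/data', {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/octet-stream' },
//     body: body
//   });

// Node/Express side: read the raw body and gunzip it with the built-in zlib module.
const express = require('express');
const zlib = require('zlib');

const app = express();

app.post('/data', express.raw({ type: 'application/octet-stream', limit: '10mb' }), (req, res) => {
  zlib.gunzip(req.body, (err, buf) => {
    if (err) return res.status(400).send('Invalid gzip payload');
    const data = JSON.parse(buf.toString('utf8'));
    res.json({ keys: Object.keys(data).length }); // echo something back for illustration
  });
});

app.listen(3000);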
