crypto.decipher causes stream not to close in Node.js

I'm trying to decrypt a file and send it in a response to a client. It works fine for just downloading the file, like this:
input.pipe(res);
but when I add the decipher into the pipe, like this:
input.pipe(decipher).pipe(res);
it causes the file download to stay open in the browser. Do I need to close the decipher stream or something?
Here's the full method:
router.get('/', function(req, res, next) {
  var filePath = 'C:\\Users\\Anthony\\test';
  var stat = fs.statSync(filePath);
  var key = '1234asdf';
  var decipher = crypto.createDecipher('aes-256-cbc', key);
  res.setHeader('Content-Length', stat.size);
  res.setHeader('Content-disposition', 'attachment; filename=test.mp4');
  var input = fs.createReadStream(filePath);
  input.pipe(decipher).pipe(res);
});

Most likely you're giving the browser the encrypted file's length rather than the decrypted length; with aes-256-cbc the ciphertext is padded, so the two generally differ, and the browser keeps the download open waiting for bytes that never arrive. Try omitting the Content-Length header entirely and see if that works (the response will fall back to chunked transfer encoding instead).
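A minimal sketch of that change, assuming the file really was encrypted with the same algorithm and key (crypto.createDecipher is deprecated in current Node; it's kept here only to match the question's code):
router.get('/', function (req, res, next) {
  var filePath = 'C:\\Users\\Anthony\\test';
  var decipher = crypto.createDecipher('aes-256-cbc', '1234asdf');
  // No Content-Length header: the response falls back to chunked transfer
  // encoding, and the browser closes the download when the stream ends.
  res.setHeader('Content-disposition', 'attachment; filename=test.mp4');
  fs.createReadStream(filePath).pipe(decipher).pipe(res);
});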

Related

Serve clickable download URL in NodeJS

At my endpoint in my NodeJS server, after retrieving an audio file stored as a Buffer in MongoDB, I want to represent it with a URL (much like how you do with URL.createObjectURL(blob) in the frontend on the browser). I then plan to res.render() the URL in HTML through Handlebars on the client, so that the user can click on it to download it:
<a href={{url}}>Click me to download the file!</a>
In the NodeJs server, I have converted the MongoDB Buffer into a JavaScript ArrayBuffer through:
var buffer = Buffer.from(recordingFiles[0].blobFile);
var arrayBuffer = Uint8Array.from(buffer).buffer;
I am unsure where to proceed from here. I've seen solutions using fs or res.download(), but they don't seem applicable to my situation. Thanks in advance for any help!
Hopefully this can help.
var blob = new Blob([buffer], { type: "audio mime type" }); // Blob takes an array of parts
var link = document.createElement('a');
link.href = window.URL.createObjectURL(blob);
var fileName = reportName;
link.download = fileName;
link.click();
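Note that this snippet is browser-side code: window.URL and document don't exist in Node, so it only helps once the raw bytes have already reached the client.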
Do you always need to preload the audio file onto the page?
If not, then I would advise you to add a separate endpoint to download the file on demand. The frontend link can send a GET request to the endpoint and download the file only if the user clicks it.
Otherwise you'd always be downloading the buffer behind the scenes, even if the user didn't intend to download it. This is especially problematic on slow connections.
Frontend:
<a href={{`${baseUrl}/download/${audioId}`}}>Click me to download the file!</a>
Backend:
const stream = require('stream');
app.get('/download/:audioId', function (request, response) {
  // Retrieve the tag from our URL path
  const audioId = request.params.audioId;
  let fileData; // TODO: Get file buffer from mongo.
  const fileContents = Buffer.from(fileData, 'base64');
  const readStream = new stream.PassThrough();
  readStream.end(fileContents);
  response.set('Content-disposition', 'attachment; filename=' + fileName); // fileName: whatever download name you choose
  response.set('Content-Type', '<your MIME type here>');
  readStream.pipe(response);
});
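The stream.PassThrough is just a convenient way to wrap the in-memory Buffer in a readable stream so it can be piped; since the whole file is already in memory, calling response.end(fileContents) directly would work as well.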
A list of relevant MIME types can be found on MDN.

Node Express Fast CSV download to client

I've set up a small Node.js backend app, built with Express and the fast-csv module. The desired outcome is to download a CSV file to the client without storing it anywhere on the server, since the data is generated depending on user criteria.
So far I've gotten somewhere with it. I'm using streams, since the CSV file could be pretty large depending on the user selection. I'm pretty sure something is missing in the code below:
const fs = require('fs');
const fastCsv = require('fast-csv');
.....
(inside api request)
.....
router.get('/', async (req, res) => {
  const gatheredData ...
  const filename = 'sometest.csv'
  res.writeHead(200, {
    'Content-Type': 'text/csv',
    'Content-Disposition': 'attachment; filename=' + filename
  })
  fastCsv.write(gatheredData, { headers: true }).pipe(res)
})
The above code 'works' in a way: it does deliver a response, but not as a file download. I get the contents of the CSV, which I can view in the preview tab of the response. To sum up, I'm trying to stream the data as a CSV and push it to the client as a file download, without storing it on the server. Any tips or pointers are very much appreciated.
Here's what worked for me after creating a CSV file on the server using the fast-csv package. You need to specify the full, absolute directory path where the output CSV file was created:
const csv = require("fast-csv");
const csvDir = "abs/path/to/csv/dir";
const filename = "my-data.csv";
const csvOutput = `${csvDir}/${filename}`;
console.log(`csvOutput: ${csvOutput}`); // full path
/*
CREATE YOUR csvOutput FILE USING 'fast-csv' HERE
*/
res.type("text/csv");
res.header("Content-Disposition", `attachment; filename="${filename}"`);
res.header("Content-Type", "text/csv");
res.sendFile(filename, { root: csvDir });
Make sure to change the response Content-Type header to "text/csv", and try enclosing the filename=... part in double quotes, as in the example above.
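If you'd rather skip writing the intermediate file entirely, which is what the question asked for, fast-csv can pipe straight into the response. A minimal sketch, assuming gatheredData is an array of row objects (getDataForUser is a hypothetical data source):
router.get('/', async (req, res) => {
  const gatheredData = await getDataForUser(req); // hypothetical: build rows from user criteria
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="sometest.csv"');
  fastCsv.write(gatheredData, { headers: true }).pipe(res);
});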

Can't write/append to JSON file in Node Webkit

I want to have persistent memory (store the user's progress) in a .json file in %AppData%. I tried doing this according to this post, but it doesn't work. For testing purposes I'm only working with storing one object.
The code below doesn't work at all. If I use fs.open(filePath, "w", function(err, data) { ... instead of readFile(..., it does create a JSON file in %AppData%, but then it doesn't write anything to it; it's always 0 bytes.
var nw = require('nw.gui');
var fs = require('fs');
var path = require('path');
var file = "userdata.json";
var filePath = path.join(nw.App.dataPath, file);
console.log(filePath); // <- This shows correct path in Application Data.
fs.readFile(filePath, function (err, data) {
  var idVar = "1";
  var json = JSON.parse(data);
  json.push("id :" + idVar);
  fs.writeFile(filePath, JSON.stringify(json));
});
If anyone has any idea where I'm messing this up, I'd be grateful.
EDIT:
Solved, thanks to kailniris.
I was simply trying to parse an empty file
There is no JSON in the file you try to read. Before parsing the data, check whether the file is empty. If it is, create an empty JSON array, push the new data into it, and write it to the file; otherwise, parse the JSON already in the file.
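A minimal sketch of that check, assuming the stored JSON is an array:
fs.readFile(filePath, 'utf8', function (err, data) {
  var json;
  if (err || data.trim().length === 0) {
    json = []; // missing or empty file: start from an empty array
  } else {
    json = JSON.parse(data);
  }
  json.push({ id: "1" });
  fs.writeFile(filePath, JSON.stringify(json), function (err) {
    if (err) console.log(err);
  });
});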

NODEJS Create zip from byte string

I have to make a POST request to a server that responds with a byte string, and I have to write a ZIP file from it. How do I get the bytes from that string to generate my zip file?
You just need to create a buffer from the API response and write a zip file using that buffer.
var fs = require('fs');
var buff = Buffer.from(response_from_api); // new Buffer() is deprecated
fs.writeFile("./test1.zip", buff, function (err) {
  if (err) throw err;
  // do something
});
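One caveat: if the server returns the bytes as a base64 string, pass the encoding explicitly, e.g. Buffer.from(response_from_api, 'base64'); otherwise the string is decoded as UTF-8 and the resulting archive will be corrupted.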

nodejs web root

I was under the impression that when you run a nodejs webserver the root of the web is the folder containing the js file implementing the webserver. So if you have C:\foo\server.js and you run it, then "/" refers to C:\foo and "/index.htm" maps to C:\foo\index.htm
I have a file server.js with a sibling file default.htm, but when I try to load the contents of /default.htm the file is not found. An absolute file path works.
Where is "/" and what controls this?
Working up a repro I simplified it to this:
var fs = require('fs');
var body = fs.readFileSync('/default.htm');
and noticed it's looking for this
Error: ENOENT, no such file or directory 'C:\default.htm'
So "/" maps to C:\
Is there a way to control the mapping of the web root?
I notice that relative paths do work. So
var fs = require('fs');
var body = fs.readFileSync('default.htm');
succeeds.
I believe my confusion arose from the coincidental placement of my original experiment's project files at the root of a drive. This allowed references to /default.htm to resolve correctly; it was only when I moved things into a folder to place them under source control that this issue was revealed.
I will certainly look into express, but you haven't answered my question: is it possible to remap the web root and if so how?
As a matter of interest this is server.js as it currently stands
var http = require('http');
var fs = require('fs');
var sys = require('sys');
var formidable = require('formidable');
var util = require('util');
var URL = require('url');
var QueryString = require('querystring');

var mimeMap = { htm: "text/html", css: "text/css", json: "application/json" };

http.createServer(function (request, response) {
  var body, token, value, mimeType;
  var url = URL.parse(request.url);
  var path = url.pathname;
  var params = QueryString.parse(url.query);
  console.log(request.method + " " + path);
  switch (path) {
    case "/getsettings":
      try {
        mimeType = "application/json";
        body = fs.readFileSync("dummy.json"); // mock db
      } catch (exception) {
        console.log(exception.message);
        body = exception;
      }
      break;
    case "/setsettings":
      mimeType = "application/json";
      body = "{}";
      console.log(params);
      break;
    case "/":
      path = "default.htm";
      // fall through to the default handler
    default:
      try {
        mimeType = mimeMap[path.substring(path.lastIndexOf('.') + 1)];
        if (mimeType) {
          body = fs.readFileSync(path);
        } else {
          mimeType = "text/html";
          body = "<h1>Error</h1><body>Could not resolve mime type from file extension</body>";
        }
      } catch (exception) {
        mimeType = "text/html";
        body = "<h1>404 - not found</h1>" + exception.toString();
      }
      break;
  }
  // writeHead may only be called once; send all headers together
  response.writeHead(200, {
    'Content-Type': mimeType,
    'Cache-Control': 'no-cache',
    'Pragma': 'no-cache'
  });
  response.end(body);
}).listen(8124);
console.log('Server running at http://127.0.0.1:8124/');
I'm not completely certain what you mean by "routes" but I suspect that setsettings and getsettings are the sort of thing you meant, correct me if I'm wrong.
Nodejs does not appear to support arbitrary mapping of the web root.
All that is required is to prepend absolute web paths with a period prior to using them in the file system:
var URL = require('url');
var url = URL.parse(request.url);
var path = url.pathname;
if (path[0] == '/')
path = '.' + path;
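One caveat: a request like /../secret.txt can still escape the folder after the period is prepended. Inside the same request handler, a minimal guard might resolve the path and check that it stays under the working directory:
var pathUtil = require('path'); // named pathUtil so it doesn't clash with the 'path' variable above
var base = pathUtil.resolve('.');
var resolved = pathUtil.resolve('.' + url.pathname);
if (resolved !== base && resolved.indexOf(base + pathUtil.sep) !== 0) {
  response.writeHead(403, { 'Content-Type': 'text/plain' });
  response.end('Forbidden');
  return;
}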
While you're correct that the root of the server is the current working directory, Node.js won't do a direct pass-through to the files on your file system; that could be a bit of a security risk, after all.
Instead you need to provide it with routes that then in turn provide content for the request being made.
A simple server like
var http = require('http');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.end('Hello World\n');
}).listen(1337, '127.0.0.1');
will just capture any request and respond the same way (without ever reading the file system).
Now, if you want to serve file contents, you need some way to read the file into the response stream. This can be done a few ways:
You can use the fs API to find the file on disk, read its contents into memory, and then pipe them out to the response. This is a pretty tedious approach, especially once you have a larger number of files, but it does give you very direct control over what's happening in your application.
You can use a middleware framework like express.js, which IMO is a much better fit for what you're trying to do. There are plenty of questions and answers on using Express here on Stack Overflow; a static file server, which is what you describe, is the canonical example (see the sketch below).
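A minimal sketch of that approach, assuming your static files live in a ./public folder next to server.js:
var express = require('express');
var path = require('path');
var app = express();
// Serve everything under ./public at the web root;
// GET /default.htm maps to ./public/default.htm.
app.use(express.static(path.join(__dirname, 'public')));
app.listen(8124, function () {
  console.log('Server running at http://127.0.0.1:8124/');
});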
Edit
With the clarification of the question, the reason that:
var body = fs.readFileSync('/default.htm');
results in Node looking for the file at C:\default.htm is that you're using an absolute path, not a relative one. If you had:
var body = fs.readFileSync('./default.htm');
it would then know that you want to operate relative to the current working directory. / resolves from the root of the partition; ./ resolves from the current working directory.
