Express res.download() not actually downloading file - node.js

I'm attempting to return generated files to the front end through Express' res.download function. I'm using Chrome, but whenever I call the API that executes the following code, all I get back is the same response I would get from Express' res.sendFile() function.
I know that res.download uses res.sendFile, but I would like the download function to actually save to the file system instead of just returning the file in the body of the response.
This is my code.
exports.download = function(req, res) {
  var filePath = // somefile that I want to download
  res.download(filePath, 'response.txt', function(err) {
    if (err) throw err;
  });
};
I know that the above code at least partly works because I'm getting back, in the response, the contents of the file. However, I want it to be saved onto the file system.
Am I misunderstanding what the download function is supposed to do? Do I just need to take the response data and write it to the file system manually?

res.download adds headers that suggest to the browser that the file should be downloaded rather than displayed. However, there's no way to force the browser to do this; how a particular file is handled is ultimately up to the browser and, typically, the user.
If you're triggering this request with AJAX, that won't cause a download either, because it's your JavaScript code, not the browser's navigation, that receives the data.
Do I just need to take the response data and write it to the file system manually?
You don't have file system access in browser-side JavaScript. I'm not sure how you intend to do this.
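A minimal browser-side sketch (the '/api/download' route name here is hypothetical): instead of fetching the endpoint with AJAX, let the browser navigate to it, so the Content-Disposition header set by res.download can trigger an actual save-to-disk download:
// In the front-end code, replace the AJAX call with a plain navigation:
window.location.href = '/api/download'; // hypothetical route handled by exports.download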

Related

Downloading Binary File from OneDrive API Using Node/Axios

I am using the One Drive API to grab a file with a node application using the axios library.
I am simply trying to save the file to the local machine (node is running locally).
I use the One Drive API to get the download document link, which does not require authentication (with https://graph.microsoft.com/v1.0/me/drives/[location]/items/[id]).
Then I make this call with the download document link:
response = await axios.get(url);
I receive a JSON response, which includes, among other things, the content-type, content-length, content-disposition and a data element which is the contents of the file.
When I display the JSON response to the console, the data portion looks like this:
data: 'PK\u0003\u0004\u0014\u0000\u0006\u0000\b\u0000\u0000\u0000!\u...'
If the document is simply text, I can save it easily using:
fs.writeFileSync([path], response.data);
But if the file is binary, like a docx file, I cannot figure out how to write it properly. Every time I try it seems to have the wrong encoding. I tried different encodings.
How do I save the file properly based on the type of file retrieved?
Have you tried passing an explicit encoding of null to fs.writeFileSync, signifying that the data is binary?
fs.writeFileSync([path], response.data, {
  encoding: null
});
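If response.data has already been decoded into a string by axios, the bytes may be damaged before they ever reach fs.writeFileSync. A minimal sketch of a companion approach, assuming the same download URL as in the question, is to request the body as raw bytes with axios's responseType option so response.data arrives as a Buffer:
const axios = require('axios');
const fs = require('fs');

async function saveFile(url, path) {
  // Ask axios for the raw bytes instead of letting it decode the body as text
  const response = await axios.get(url, { responseType: 'arraybuffer' });
  // In Node, response.data is now a Buffer, which writeFileSync stores verbatim
  fs.writeFileSync(path, response.data);
}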

How to load external file with parameters

So I'm creating a script that uses NightmareJS and request. I'm making requests to grab data from a webpage and then having NightmareJS navigate to a page as well. I'm then injecting a JavaScript file into the Nightmare session using
.inject('js', 'injectFile.js')
This all works perfectly; however, I'm trying to achieve something else. After grabbing data from the other page with request, I would like to pass that data into the injectFile.js file. For example, I would get a URL with the request and then use that URL in injectFile.js when it is called. Is there any way (or module) to achieve this? Thanks in advance.
The best way to do this is to define a function in injectFile.js, so that whatever you're doing doesn't run immediately when you inject the file, but only when you call the function:
function doStuff(params) {
  // do stuff with params
  // (this probably contains your entire injectFile.js script)
}
Then use nightmare.evaluate to call that function after you have injected it into the browser context:
nightmare.evaluate(function(params) {
  doStuff(params);
}, yourFavoriteParamValues)
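A minimal end-to-end sketch (the URLs, variable names, and scraping logic are hypothetical), assuming the request module and a Nightmare instance as described in the question:
var request = require('request');
var Nightmare = require('nightmare');
var nightmare = Nightmare();

// Grab a value (here, a URL) from another page with request...
request('https://example.com/source-page', function(err, response, body) {
  if (err) throw err;
  var extractedUrl = body.trim(); // stand-in for real scraping logic

  // ...then hand it to the injected script via evaluate's extra arguments
  nightmare
    .goto('https://example.com/target-page')
    .inject('js', 'injectFile.js')
    .evaluate(function(url) {
      doStuff(url); // doStuff is defined in injectFile.js
    }, extractedUrl)
    .then(function() {
      return nightmare.end();
    });
});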

Piping a readstream into a writestream does not work

I (as the client) am trying to post an image with restify, and the server just needs to save it.
req.pipe(fs.createWriteStream('test.jpg'));
is not working. An empty file is created, but nothing more. It works when I copy req.body into a buffer and then call fs.writeFile(...). I have also tried req.body.pipe, but that throws an error.
You're probably using a body parser middleware that is already reading all of the data from the request so there is nothing left to read. Try adjusting the placement of your route handler and/or body parsing middleware if you want to read directly from the request object.
However, that will only work if the request contains only the image data. Typically a request is formatted as multipart/form-data if it contains at least one file, so you cannot just pipe the request and expect image data only.
So something else in your middleware chain, probably restify.bodyParser(), has already consumed the request body into a buffer or string (req.body), and you can't stream a request twice. Find that middleware and disable it for this route if you want to stream the upload straight to the filesystem yourself.
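A minimal server-side sketch, assuming a restify server, a route that receives only the raw image bytes (no multipart wrapper), and no body-parsing middleware applied to it:
var restify = require('restify');
var fs = require('fs');

var server = restify.createServer();

// No bodyParser on this route, so the request stream is still unread here
server.post('/upload', function(req, res, next) {
  var out = fs.createWriteStream('test.jpg');
  req.pipe(out);
  out.on('finish', function() {
    res.send(201);
    return next();
  });
  out.on('error', next);
});

server.listen(8080);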

nodejs require() json from file garbage collection

I'm using a file for storing JSON data. My module makes CRUD actions on the file, and I'm using require() to load the JSON instead of fs.readFile(). The issue is that if the file is deleted using fs.unlink(), calling require on it again still loads the file... which has just been deleted. I'm a bit lost as to how to get around this; possibly #garbage-collection?
Example:
fs.writeFile('foo.json', JSON.stringify({foo: "bar"}), function() {
  var j = require('./foo.json');
  fs.unlink('./foo.json', function() {
    console.log('File deleted');
    var j = require('./foo.json');
    console.log(j);
  });
});
When loading a module using require, Node.js caches the loaded module internally so that subsequent calls to require do not need to access the drive again. The same is true for .json files, when loaded using require.
That's why your file still is "loaded", although you deleted it.
The solution to this issue is to use the function for loading a file that is appropriate for it, which you already mentioned: fs.readFile(). Once you use that, everything will work as expected.
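A minimal sketch of that suggestion, re-reading foo.json from the example with fs.readFile so that a deleted file produces an ENOENT error instead of a stale cached object:
var fs = require('fs');

fs.readFile('./foo.json', 'utf8', function(err, data) {
  if (err) return console.error(err); // e.g. ENOENT after fs.unlink
  var j = JSON.parse(data);           // parsed fresh from disk on every call
  console.log(j);
});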

How to upload a file to a user after a front-end action in Node.js

I'm working on a project built entirely in node.js and coffeescript. I want to allow the user to export a CSV of several different collections in my Mongo DB by clicking a button on my website.
I believe the best way to do this would be to make an AJAX call to my Node.js backend and have that call return somefile.csv to the user. I'm at a loss as to how to do this, though, and there are so many conflicting resources. Here's the stub of how I think things should work:
exports.exportToCSV = (req, res) ->
  console.log 'Inside exportToCSV'
  # Create a dynamic csv file
  # How to?
  # Set the response headers
  # How to?
  # Attach the newly created CSV
  # How to?
  # Write the response
  res.write('somefile.csv')
  res.end()
Any help would be greatly appreciated. Thank you.
If you're using Express (and I'd say you need a pretty big excuse not to), everything after creating the CSV is a piece of cake:
res.download 'somefile.csv'
As the Express docs explain, that's shorthand for
res.attachment 'somefile.csv'
(which sets the headers) and
res.sendfile 'somefile.csv'
If you want to understand how it all works, here's the source: https://github.com/visionmedia/express/blob/master/lib/response.js
As to creating a CSV, I've never had to do this, but you can't go wrong searching npm for "csv".
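A minimal sketch of the full handler in plain JavaScript (the question itself is in CoffeeScript), with a hard-coded CSV string standing in for the data you would actually pull from your Mongo collections:
var fs = require('fs');

exports.exportToCSV = function(req, res) {
  var csv = 'name,count\nfoo,1\nbar,2\n'; // build this from your collections
  fs.writeFile('somefile.csv', csv, function(err) {
    if (err) return res.send(500);
    // Sets Content-Disposition: attachment and streams the file to the client
    res.download('somefile.csv');
  });
};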
