I'm quite new to Node.js and trying different things.
What I was able to do is download a file using the following code:
app.get('/download', function(req, res){
var file = 'public/songs/myfile.mp3';
var filename = path.basename(file);
var mimetype = mime.lookup(file);
res.setHeader('Content-disposition', 'attachment; filename=' + filename);
res.setHeader('Content-type', mimetype);
var stat = fs.statSync(file); // file.length would be the length of the path string, not the file size
res.setHeader('Content-Length', stat.size);
var filestream = fs.createReadStream(file);
filestream.pipe(res);
});
This works well. Now what I'm trying to achieve is to throttle the download speed, so that if someone downloads the file it is served at, say, 1 Mbps at most.
I've tried to use this code: https://gist.github.com/4poc/1454516
When I load the page it seems to load indefinitely, and I think the problem is this line:
filestream.pipe(limitStream);
since no response is ever sent.
How can I implement what I would like to do, or how can I fix the code I tried to use?
The req and res objects are streams, so you can pipe through the throttle stream and into the response:
var filestream = fs.createReadStream(file);
filestream.pipe(limitStream).pipe(res);
FWIW: pipe() returns the destination stream, which is what makes chaining possible. The above is the same as this:
var filestream = fs.createReadStream(file);
var throttleStream = filestream.pipe(limitStream);
throttleStream.pipe(res);
This is important to understand because it's tempting to do this, but it won't do what you expect:
var filestream = fs.createReadStream(file);
filestream.pipe(limitStream);
filestream.pipe(res);
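In that last version both pipes read from filestream directly, so the data also flows to res at full speed and the throttle is bypassed. Putting it all together, here is a minimal sketch of the throttled route, assuming the throttle npm package (npm install throttle; its constructor is a Transform stream that takes a bytes-per-second limit):
var Throttle = require('throttle'); // assumption: the "throttle" package

app.get('/download', function(req, res) {
  var file = 'public/songs/myfile.mp3';
  res.setHeader('Content-disposition', 'attachment; filename=' + path.basename(file));
  res.setHeader('Content-type', mime.lookup(file));
  fs.createReadStream(file)
    .pipe(new Throttle(125000)) // ~1 Mbps = 125,000 bytes per second
    .pipe(res);
});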
Related
var fs = require('fs');
var stream = fs.createWriteStream("my_file.txt");
stream.once('open', function(fd) {
stream.write("My first row\n");
stream.write("My second row\n");
stream.end();
});
If I have the code above, how would I go about downloading my_file.txt to the downloads folder on a user's device (i.e. laptop or mobile device)?
I have chat messages on screen and I want to write them to a file the user can download for reference.
Okay, if you have the file on disk you can send it as a download like this:
app.get('/download', function (req, res) {
var file = 'file_path_goes_here';
var filename = path.basename(file);
var mimetype = mime.lookup(file);
res.setHeader('Content-disposition', 'attachment; filename=' + filename);
res.setHeader('Content-type', mimetype);
var filestream = fs.createReadStream(file);
filestream.pipe(res);
});
Complete code is available as a node-cheat at express_server_download_file; run npm install express mime followed by node app.
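Alternatively, since this is Express, res.download() sets the Content-Disposition (and Content-Type) headers for you; a minimal sketch:
app.get('/download', function (req, res) {
  var file = 'file_path_goes_here';
  res.download(file); // streams the file as an attachment
});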
I'm trying to send a file on the server to the browser as a download. res.download is crashing the browser.
Any ideas?
Code
var filename = path.basename(userPathZip);
var mimetype = mime.lookup(userPathZip);
res.setHeader('Content-disposition', 'attachment; filename=test.zip');
res.setHeader('Content-type', mimetype);
var filestream = fsextra.createReadStream(userPathZip);
filestream.pipe(res);
In the Network tab in Chrome the response is 24 MB, which is the size of the file, so I'm not sure what's going on here.
The piping direction matters here: the readable file stream is the source and the response is the destination, so it must be filestream.pipe(res), not res.pipe(filestream) (res is a writable stream and has no pipe() method).
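If the browser still stalls, one possibility (my assumption; the question doesn't confirm it) is that without a Content-Length header the browser can't tell when the transfer is finished. A sketch that adds it:
fsextra.stat(userPathZip, function (err, stats) {
  if (err) { res.statusCode = 500; return res.end(); }
  res.setHeader('Content-disposition', 'attachment; filename=test.zip');
  res.setHeader('Content-type', mime.lookup(userPathZip));
  res.setHeader('Content-Length', stats.size); // lets the browser show progress and detect completion
  fsextra.createReadStream(userPathZip).pipe(res);
});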
What would be the best way to serve files to admins only? I don't want the files ending up in a public folder.
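One common pattern (a sketch only; adminOnly, req.user and the private_files directory are hypothetical and depend on your auth setup) is to keep the files outside the public/static folder and stream them from a route guarded by a middleware check:
// Hypothetical middleware: plug in however your app identifies admins.
function adminOnly(req, res, next) {
  if (req.user && req.user.isAdmin) return next();
  res.statusCode = 403;
  res.end('Forbidden');
}

app.get('/admin/files/:name', adminOnly, function (req, res) {
  // path.basename strips directory components, so "../" can't escape the folder
  var file = path.join(__dirname, 'private_files', path.basename(req.params.name));
  res.download(file); // or stream it manually with fs.createReadStream(file).pipe(res)
});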
I need to stream files from a client (a Node.js command-line tool) to a server (Express).
This is the client side:
var request = require('request');
var fs = require('fs');
// ...
var readStream = fs.createReadStream(file.path);
readStream.on('end', function() {
that.emit('finished');
});
readStream.pipe(request.post(target));
// ...
This is the server side:
var fs = require('fs');
var path = require('path');
// ...
app.post('/:filename', function(req, res) {
req.setEncoding('binary');
var filename = path.basename(req.params.filename);
filename = path.resolve(destinationDir, filename);
var dst = fs.createWriteStream(filename);
req.pipe(dst);
req.on('end', function() {
res.send(200);
});
});
// ...
Everything works and the files are saved correctly on the server side... but they are about 50% bigger than the source files. I compared the two files with hexdump and the server-side file has similar content, but with extra 0xC2 bytes here and there. I guess this is related to encoding.
Don't call req.setEncoding('binary').
This converts every chunk into a string and is only meant for when you want to read text from the stream. When those strings are then written to the file they get re-encoded as UTF-8, so every byte >= 0x80 becomes two bytes; that's where the 0xC2 prefixes and the roughly 50% size increase come from. Since you pipe the request directly to a file, you don't need it at all.
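A corrected version of the handler is just the same code without that call:
app.post('/:filename', function(req, res) {
  var filename = path.basename(req.params.filename);
  filename = path.resolve(destinationDir, filename);
  var dst = fs.createWriteStream(filename);
  req.pipe(dst); // raw bytes go straight to disk, no string conversion
  req.on('end', function() {
    res.send(200);
  });
});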
Let's say you create a zip file in-memory following the example from node-zip's documentation:
var zip = new require('node-zip')()
zip.file('test.file', 'hello there')
var data = zip.generate({type:'string'})
How do you then send data to a browser such that it will accept it as a download?
I tried this, but the download hangs at 150/150 bytes AND makes Chrome start eating 100% CPU:
res.setHeader('Content-type', 'application/zip')
res.setHeader('Content-disposition', 'attachment; filename=Zippy.zip');
res.send(data)
So what's the proper way to send zip data to a browser?
Using the archiver and string-stream packages:
var http = require('http')
var archiver = require('archiver')
var fs = require('fs')
var StringStream = require('string-stream')
http.createServer(function(request, response) {
var dl = archiver('zip')
dl.pipe(response)
dl.append(fs.createReadStream('/path/to/some/file.txt'), {name:'YoDog/SubFolder/static.txt'})
dl.append(new StringStream("Ooh dynamic stuff!"), {name:'YoDog/dynamic.txt'})
dl.on('error', function (err) {
response.statusCode = 500
response.end()
})
dl.finalize() // errors arrive via the 'error' event
}).listen(3000)
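Since the goal is a browser download, you will probably also want to set the headers before the dl.pipe(response) call (the filename here is just an example):
response.setHeader('Content-Type', 'application/zip')
response.setHeader('Content-Disposition', 'attachment; filename=Zippy.zip')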
I recommend using streams for this.
var fs = require('fs');
var zlib = require('zlib');
var http = require('http');
http.createServer(function(request, response) {
response.writeHead(200, { 'Content-Type': 'application/octet-stream' });
var readStream = fs.createReadStream('test.file');
var unzipStream = zlib.createUnzip();
readStream.pipe(unzipStream).pipe(response);
}).listen(3000);
This will probably not work as-is in the real world (I'm not that familiar with zlib), but it may point you in the right direction.
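One caveat to add here (my note): zlib produces gzip/deflate streams, not .zip archives, so it can't replace node-zip or archiver for building a zip. If the goal is just to stream a single file compressed, a gzip sketch would look like this:
var fs = require('fs');
var zlib = require('zlib');
var http = require('http');

http.createServer(function(request, response) {
  response.writeHead(200, {
    'Content-Type': 'application/octet-stream',
    'Content-Encoding': 'gzip', // the browser transparently decompresses the body
    'Content-Disposition': 'attachment; filename=test.file'
  });
  fs.createReadStream('test.file')
    .pipe(zlib.createGzip()) // compress on the fly
    .pipe(response);
}).listen(3000);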
In most examples you find on the web, an index.html file is served like the following:
function serveIndexPage(response) {
fs.readFile(__dirname + '/public/index.html', function (err, data) {
response.end(data);
});
};
This seems like a bad idea, as the whole file is read into memory and then sent to the client. Is there a better way to do this? I know that libraries like Connect and Express provide such functionality, but for my project I'd like to use plain Node.js.
EDIT
Also, you sometimes see readFileSync used, which is even worse IMHO.
Pipe your data through; a simple static HTTP server looks like this:
var Http = require('http'),
Url = require('url'),
Path = require('path'),
Fs = require('fs');
Http.createServer(function(req, res) {
var pathname = Url.parse(req.url).pathname;
var filename = Path.join(process.cwd(), pathname);
// fs.stat tells us whether the file exists (the old path.exists API was removed from Node)
Fs.stat(filename, function(err, stats) {
  if (err || !stats.isFile()) {
    res.writeHead(404);
    res.end();
    return; // without this return, the 200 branch below would still run
  }
  res.writeHead(200); // ideally also pass { 'Content-Type': ... } based on the file's mime type
  var fileStream = Fs.createReadStream(filename);
  fileStream.pipe(res);
});
}).listen(process.env.PORT || 1999);
Piping is roughly shorthand for something like this (with the important difference that pipe() also handles backpressure, pausing the source when the destination can't keep up):
var s = Fs.createReadStream(filename);
s.on('data', function (data) {
res.write(data);
});
s.on('end', function() {
res.end();
});
In theory you could read the file line by line response.write()'ing every line to the client.
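For completeness, the backpressure handling that pipe() gives you for free looks roughly like this when done by hand (a sketch of the mechanism, not pipe()'s actual source):
var s = Fs.createReadStream(filename);
s.on('data', function (data) {
  if (!res.write(data)) {  // write() returns false when the response buffer is full
    s.pause();             // stop reading until the client catches up
    res.once('drain', function () {
      s.resume();          // buffer flushed, keep going
    });
  }
});
s.on('end', function () {
  res.end();
});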