Download text file with Node - node.js

I want to download and save a text file using Node. But after downloading, the pipe method raises a memory-leak warning. Setting the emitter's max listeners to zero (unlimited) makes the warning disappear, but then the file grows past 1 GB!
var http = require('http');
var fs = require('fs');

var download = function (uri, filename, callback) {
    var file = fs.createWriteStream(filename);
    http.get(uri, function (res) {
        res.pipe(file);
        res.on('end', function () {
            file.end();
        });
    });
};
ERROR: MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 end listeners added. Use emitter.setMaxListeners() to increase limit
I couldn't find any solution for saving a text file from the web. What's the best way to do this?
EDIT:
I moved the pipe method around to solve the memory problem, but now the file contains binary data rather than text. How can I save it as a text file?

One possible approach: install the request package,
npm i request
then use the following code instead of yours:
var request = require('request');
var fs = require('fs');

var download = function (uri, filename, callback) {
    request(uri).pipe(fs.createWriteStream(filename));
};
This way it will handle memory, network, and back-pressure concerns on its own in nearly all cases.
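If you'd rather avoid the extra dependency, a minimal sketch using only core modules (assuming an HTTP URL and UTF-8 content) could look like this; setting the response encoding makes the stream emit strings rather than Buffers, which speaks to the "binary data" concern in the edit:

var http = require('http');
var fs = require('fs');

var download = function (uri, filename, callback) {
    var file = fs.createWriteStream(filename);
    http.get(uri, function (res) {
        res.setEncoding('utf8'); // emit strings so the file is written as UTF-8 text
        res.pipe(file);          // pipe handles back-pressure and ends the file for us
        file.on('finish', function () { callback(null); });
    }).on('error', function (err) { callback(err); });
};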

This code writes a UTF-8 string to the text file at filepath, but it appears to be functionally equivalent to your latest update (edit 3).
If you're still getting binary/buffered output after trying this code, please provide the result of typeof res.
const http = require('http');
const fs = require('fs');

const downloadFile = (uri, filepath, callback) => {
    http.get(uri, res => {
        console.log(typeof res);
        res.pipe(fs.createWriteStream(filepath))
            .on('finish', callback);
    });
};
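A hypothetical call (the URL and paths here are placeholders) would then be:

downloadFile('http://example.com/notes.txt', './notes.txt', () => {
    console.log('saved ./notes.txt');
});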

Related

How to combine video upload chunks Node.js

I'm trying to upload a large (8.3 GB) video to my Node.js (Express) server in chunks using busboy. How do I receive each chunk (busboy is handling that part) and piece them together into one whole video?
I have been looking into readable and writable streams, but I never end up with the whole video; I keep overwriting parts of it, ending up with about 1 GB.
Here's my code:
req.busboy.on('file', (fieldname, file, filename) => {
    logger.info(`Upload of '${filename}' started`);
    const video = fs.createReadStream(path.join(`${process.cwd()}/uploads`, filename));
    const fstream = fs.createWriteStream(path.join(`${process.cwd()}/uploads`, filename));
    if (video) {
        video.pipe(fstream);
    }
    file.pipe(fstream);
    fstream.on('close', () => {
        logger.info(`Upload of '${filename}' finished`);
        res.status(200).send(`Upload of '${filename}' finished`);
    });
});
After 12+ hours, I figured it out using pieces from an article that was shared with me. I came up with this code:
// busboy is middleware on my index.js
const fs = require('fs-extra');
const path = require('path');
const streamToBuffer = require('fast-stream-to-buffer');

// API function called first
uploadVideoChunks(req, res) {
    req.pipe(req.busboy);
    req.busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
        const fileNameBase = filename.replace(/\.[^/.]+$/, '');
        // save all the chunks to a temp folder with .tmp extensions
        streamToBuffer(file, function (error, buffer) {
            const chunkDir = `${process.cwd()}/uploads/${fileNameBase}`;
            fs.outputFileSync(path.join(chunkDir, `${Date.now()}-${fileNameBase}.tmp`), buffer);
        });
    });
    req.busboy.on('finish', () => {
        res.status(200).send(`Finished uploading chunk`);
    });
}
// API function called once all chunks are uploaded
saveToFile(req, res) {
    const { filename, profileId, movieId } = req.body;
    const uploadDir = `${process.cwd()}/uploads`;
    const fileNameBase = filename.replace(/\.[^/.]+$/, '');
    const chunkDir = `${uploadDir}/${fileNameBase}`;
    const outputFile = fs.createWriteStream(path.join(uploadDir, filename));
    fs.readdir(chunkDir, function (error, filenames) {
        if (error) {
            throw new Error('Cannot get upload chunks!');
        }
        // loop through the temp dir in timestamp order and write each chunk
        // to the stream to create a new file (readdir order is not guaranteed,
        // but the fixed-width Date.now() prefixes sort chronologically)
        filenames.sort().forEach(function (tempName) {
            const data = fs.readFileSync(`${chunkDir}/${tempName}`);
            outputFile.write(data);
            // delete the chunk we just handled
            fs.removeSync(`${chunkDir}/${tempName}`);
        });
        outputFile.end();
    });
    outputFile.on('finish', function () {
        // delete the temp folder once the file is written
        fs.removeSync(chunkDir);
    });
}
Use streams
multer allows you to easily handle file uploads as part of an Express route. This works great for small files that don't leave a significant memory footprint, but the problem with loading a large file into memory is that you can actually run out of memory and crash your application.
For large files, use a multipart/form-data request and handle the file as a stream, assigning the read stream to that field in your request options; streams are extremely valuable for optimizing performance.
Try this code sample; I think it will work for you.
busboy.on("file", function(fieldName, file, filename, encoding, mimetype){
const writeStream = fs.createWriteStream(writePath);
file.pipe(writeStream);
file.on("data", data => {
totalSize += data.length;
cb(totalSize);
});
file.on("end", () => {
console.log("File "+ fieldName +" finished");
});
});
You can also refer to this link to resolve the problem:
https://github.com/mscdex/busboy/issues/143
I think multer is a good fit for this; did you try multer?
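For reference, a minimal multer sketch (the route and field names are assumptions) that streams each chunk to disk instead of buffering it in memory:

const express = require('express');
const multer = require('multer');

const upload = multer({ dest: 'uploads/' }); // chunks land in ./uploads
const app = express();

// each chunk arrives as an ordinary multipart upload under the
// hypothetical field name 'chunk'
app.post('/upload', upload.single('chunk'), (req, res) => {
    res.status(200).send(`Stored chunk at ${req.file.path}`);
});

app.listen(3000);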

Pipe file in write stream directly to client as download in node.js

So I'm trying to take a file used as a starting template, add data to it in the stream (without altering the original file), and serve it to the client without saving a new file on the server (I'm currently using the express module as well).
So far I pass the data in a POST request and add it to the end of the stream. Unfortunately, when you pipe a read stream to a write stream you have to specify the output file and location for the write stream. Is there any way around that? Can you set the output file location to the relevant port?
This is what I currently have (getting error: Cannot pipe, not readable):
app.post("/output_xml", function(req, res) {
var data = validateJSON(req.body);
stream_xml(data, res);
});
function stream_xml(data, res)
{
var read_stream = fs.createReadStream(__dirname + '/Static/input_template.xml')
var write_stream = fs.createWriteStream(__dirname + '/Static/output.xml') // trying to prevent saving a file to the server though
read_stream.pipe(write_stream);
read_stream.on('end', () => {
write_stream.write(data);
write_stream.write("\nAdding more stuff");
});
write_stream.pipe(res);
}
Would I be able to swap the write_stream line for anything like:
var write_stream = fs.createWriteStream('http://localhost:3000/output_xml/output.xml')
You cannot pipe from a write stream, but you can certainly pipe from a transform/duplex stream.
So you can do something like:
const { Transform } = require('stream');

const custom_response = new Transform({
    transform(chunk, encoding, callback) {
        this.push(chunk, encoding);
        callback();
    },
    flush(callback) {
        this.push(data);
        this.push("\nAdding more stuff");
        callback();
    },
});

read_stream.pipe(custom_response).pipe(res);
An alternative to stream.Transform may also be stream.PassThrough which takes the same parameters as Transform, but you only need to specify the flush method.
https://nodejs.org/api/stream.html#stream_implementing_a_transform_stream
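For instance, the same route could be written with PassThrough like this (a sketch; data and res are the same as in the question):

const fs = require('fs');
const { PassThrough } = require('stream');

function stream_xml(data, res) {
    const read_stream = fs.createReadStream(__dirname + '/Static/input_template.xml');
    const appender = new PassThrough({
        // flush runs after the whole template has passed through
        flush(callback) {
            this.push(data);
            this.push("\nAdding more stuff");
            callback();
        },
    });
    read_stream.pipe(appender).pipe(res); // nothing is saved on the server
}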

How to read file to variable in NodeJs?

I'm pretty new to NodeJs, and I am trying to read a file into a variable.
Here is my code.
var fs = require("fs"),
path = require("path"),
util = require("util");
var content;
console.log(content);
fs.readFile(path.join(__dirname,"helpers","test.txt"), 'utf8',function (err,data) {
if (err) {
console.log(err);
process.exit(1);
}
content = util.format(data,"test","test","test");
});
console.log(content);
But every time I run the script I get
undefined and undefined
What am I missing? Help please!
As stated in the comments under your question, Node is asynchronous, meaning that your function has not finished executing when your second console.log call runs.
If you move the log statement inside the callback, after reading the file, you should see the contents logged:
var fs = require("fs"),
path = require("path"),
util = require("util");
var content;
console.log(content);
fs.readFile(path.join(__dirname, "helpers", "test.txt"), 'utf8', function (err, data) {
if (err) {
console.log(err);
process.exit(1);
}
content = util.format(data, "test", "test", "test");
console.log(content);
});
Even though this will solve your immediate problem, without an understanding of the async nature of Node you're going to encounter a lot of issues.
This similar Stack Overflow answer goes into more detail about the alternatives available.
The following code snippet uses a ReadStream. It reads the data in separate chunks; if the file is small, it arrives in a single chunk. This is an asynchronous task, so any work you do with the data needs to happen once the stream has delivered it.
var fs = require('fs');

/* include the file directory and file name instead of <__dirname + '/readMe.txt'> */
var readStream = fs.createReadStream(__dirname + '/readMe.txt', 'utf8');

var content = '';
readStream.on('data', function (chunk) {
    content += chunk; // append each chunk; large files arrive in several
});
readStream.on('end', function () {
    performTask(); // runs once the whole file has been read
});

function performTask() {
    console.log(content);
}
There is also an easier way using a synchronous call. Because it is synchronous, you do not need to worry about execution order: the program only moves to the next line after the current one completes, unlike with asynchronous tasks.
A clearer and more detailed answer is provided at the following link:
Get data from fs.readFile
var fs = require('fs');

/* include your file name instead of <'readMe.txt'> and make sure the file is in the same directory */
var content = fs.readFileSync('readMe.txt', 'utf8');
or, just as easily, as follows:
const fs = require('fs');
const doAsync = require('doasync');

doAsync(fs).readFile('./file.txt')
    .then((data) => console.log(data));
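The built-in promise API (available since Node 10) gives you the same thing without the extra dependency; a minimal sketch:

const fs = require('fs').promises;

// resolves with a string because an encoding is given;
// without 'utf8' it would resolve with a Buffer
fs.readFile('./file.txt', 'utf8')
    .then((content) => console.log(content))
    .catch((err) => console.error(err));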

NodeJS writeStream empty file

I am trying to use nodeJS to save a processed image stored in a base64 string.
var buff = new Buffer(base64data, 'base64');
console.log(base64data);
var stream = fs.createWriteStream('/path/to/thefile.png');
stream.write(buff);
stream.end();
However, the resulting file is empty.
When I take the output of console.log(base64data); and decode it locally, it produces a valid png binary, so why is the file empty?
The file is a 3600x4800 px png file (i.e. it's huge), could this be a factor?
I also tried writeFile; no luck.
And yes, fs is require('fs')
Thanks
Your stream.end() may be running before the data has been flushed; writing is asynchronous, remember. Note, though, that writable streams emit 'finish', not 'end', so a handler like the following never fires:
var buff = new Buffer(base64data, 'base64');
console.log(base64data);
var stream = fs.createWriteStream('/path/to/thefile.png');
stream.write(buff);
stream.on("end", function () {
    stream.end(); // never reached: writable streams do not emit 'end'
});
Better:
var buff = new Buffer(base64data, 'base64');
console.log(base64data);
var stream = fs.createWriteStream('/path/to/thefile.png');
stream.write(buff);
stream.end();
stream.on('finish', () => {
    // 'All writes are now complete.'
});
stream.on('error', (error) => { /* handle the error */ });
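If you would rather not wire up both events by hand, stream.finished (Node 10+) invokes a single callback on either 'finish' or 'error'; a sketch reusing base64data from the question:

const fs = require('fs');
const { finished } = require('stream');

const stream = fs.createWriteStream('/path/to/thefile.png');
stream.write(Buffer.from(base64data, 'base64'));
stream.end();

// called exactly once, with an error argument if the stream failed
finished(stream, (err) => {
    if (err) throw err;
    console.log('All writes are now complete.');
});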

Resizing images with Nodejs and Imagemagick

Using nodejs and imagemagick, I am able to resize an image and send it to the browser with this.
var http = require('http'),
    spawn = require('child_process').spawn;

http.createServer(function (req, res) {
    var image = 'test.jpg';
    var convert = spawn('convert', [image, '-resize', '100x100', '-']);
    convert.stdout.pipe(res);
    convert.stderr.pipe(process.stderr);
}).listen(8080);
The test image is read from the file system; I want to change this so that the test image is a binary string.
var image = 'some long binary string representing an image.......';
My plan is to store the binary strings in MongoDB and read them out dynamically.
Take a look at the node module node-imagemagick. There is the following example on the module's page to resize an image and write it to a file...
var fs = require('fs');
var im = require('imagemagick');

im.resize({
    srcData: fs.readFileSync('kittens.jpg', 'binary'),
    width: 256
}, function (err, stdout, stderr) {
    if (err) throw err;
    fs.writeFileSync('kittens-resized.jpg', stdout, 'binary');
    console.log('resized kittens.jpg to fit within 256x256px');
});
You can alter this code to do the following...
var mime = require('mime') // get mime type based on file extension; use "npm install mime"
    , fs = require('fs')
    , http = require('http')
    , im = require('imagemagick');

http.createServer(function (req, res) {
    var filePath = 'test.jpg';
    fs.readFile(filePath, 'binary', function (err, data) {
        if (err) { throw err; }
        im.resize({
            srcData: data,
            width: 256
        }, function (err, stdout, stderr) {
            if (err) { throw err; }
            // send the resized output (stdout), not the original file,
            // so Content-Length must reflect the resized size
            res.writeHead(200, {
                'Content-Type': mime.lookup(filePath),
                'Content-Length': Buffer.byteLength(stdout, 'binary')
            });
            res.end(stdout, 'binary');
        });
    });
}).listen(8080);
P.S. I haven't run the code above yet. I will try it shortly, but it should give you an idea of how to asynchronously resize and serve a file.
Since you are using spawn() to invoke the ImageMagick command-line tool convert, the normal approach is to write intermediate files to a temp directory, where they get cleaned up either immediately after use or by a scheduled/cron job.
If you want to avoid writing a file for convert to read, one option is to base64-encode your images and use the inline format. This is similar to how images are encoded in some HTML emails or web pages.
inline:{base64_file|data:base64_data}
Inline images let you read an image defined in a special base64 encoding.
NOTE: There is a limit on the size of command-line options you can pass; the ImageMagick docs suggest 5000 bytes. Base64-encoded strings are larger than the original (Wikipedia suggests a rough guide of 137% of the original size), which could be very limiting unless you're only showing thumbnails.
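As a rough sketch of the inline form (assuming a small PNG whose base64 data fits within the argument limit, and ImageMagick on the PATH; base64String is a hypothetical variable):

const { spawn } = require('child_process');

// base64String is assumed to hold the image's base64 data
const convert = spawn('convert', [
    `inline:data:image/png;base64,${base64String}`, // decode the inline input
    '-resize', '100x100',
    'png:-', // write the resized PNG to stdout
]);
convert.stdout.pipe(process.stdout);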
Another ImageMagick format option is ephemeral:
ephemeral:{image_file}
Read and then Delete this image file.
If you want to avoid the I/O passing altogether, you would need a Node.js module that directly integrates a low-level library like ImageMagick or GD rather than wrapping command line tools.
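One such module is sharp, which binds the low-level libvips library and works directly on Buffers; a minimal sketch (imageBuffer is assumed to hold the image bytes, e.g. fetched from MongoDB):

const sharp = require('sharp');

// resize entirely in memory: no temp files, no child process
sharp(imageBuffer)
    .resize(256)
    .toBuffer()
    .then((resized) => {
        // send `resized` to the client or store it back in the database
    })
    .catch((err) => console.error(err));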
What have you tried so far? You can use GridFS to store the image data and retrieve it as a stream from there. This example is in C#; not sure if it helps...
public static void UploadPhoto(string name)
{
    var server = MongoServer.Create("mongodb://localhost:27017");
    var database = server.GetDatabase("MyDB");
    string fileName = name;
    using (var fs = new FileStream(fileName, FileMode.Open))
    {
        var gridFsInfo = database.GridFS.Upload(fs, fileName);
        var fileId = gridFsInfo.Id;
        //ShowPhoto(fileName);
    }
}
public static Stream ShowPhoto(string name)
{
    var server = MongoServer.Create("mongodb://localhost:27017");
    var database = server.GetDatabase("MyDB");
    var file = database.GridFS.FindOne(Query.EQ("filename", name));
    var stream = file.OpenRead();
    var bytes = new byte[stream.Length];
    stream.Read(bytes, 0, (int)stream.Length);
    stream.Position = 0; // rewind so the caller can read from the start
    return stream;
}
You can now use the stream returned by ShowPhoto.
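Since the question is about Node, the equivalent with the official mongodb driver's GridFSBucket might look like this (a sketch; the database name and connection string are assumptions):

const { MongoClient, GridFSBucket } = require('mongodb');

async function showPhoto(name, res) {
    const client = await MongoClient.connect('mongodb://localhost:27017');
    const bucket = new GridFSBucket(client.db('MyDB'));
    // stream the stored image straight to the HTTP response
    bucket.openDownloadStreamByName(name).pipe(res);
}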
