Why is CreateReadStream not firing error event? - node.js

I am attempting to do some error handling for a CSV file. When I change the file name to one that does not exist,
Error: ENOENT: no such file or directory, open 'testdat.csv' is displayed, but the 'error' event handler is never fired.
Here is the current implementation:
const csv = require('csv-parser');
const fs = require('fs');

const temp = [];

fs.createReadStream('testdat.csv')
  .pipe(csv())
  .on('open', () => {
  })
  .on('headers', (headers) => {
    validateHeader(headers);
  })
  .on('data', (data) => {
    temp.push(data);
  })
  .on('end', () => {
    validateData(temp);
  })
  .on('error', (err) => {
    console.log(err);
  });

The 'error' handler has to be attached to the stream that actually emits the error. pipe() returns the destination stream, so in your chain the handler ends up on the csv-parser stream, while the ENOENT is emitted by the read stream, and errors do not propagate through pipe(). Attach a handler to each stream:
const csv = require('csv-parser');
const fs = require('fs');

const readable = fs.createReadStream('testdat.csv');
readable.on('error', (err) => {
  console.log(err);
});

// pipe() returns a different stream, which needs its own handler
const writable = readable.pipe(csv());
writable.on('error', (err) => {
  console.log(err);
});
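Alternatively, Node's built-in stream.pipeline forwards an error from any stream in the chain to a single callback, so you don't have to wire up a handler per stream. A minimal sketch, assuming the same csv-parser setup as above:

const { pipeline } = require('stream');
const csv = require('csv-parser');
const fs = require('fs');

const parser = csv();
parser.on('data', (row) => {
  // consume each parsed row here
});

pipeline(
  fs.createReadStream('testdat.csv'),
  parser,
  (err) => {
    if (err) {
      // an ENOENT from the read stream ends up here as well
      console.log(err);
    } else {
      console.log('CSV parsed without errors');
    }
  }
);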

Related

Write excel file to file stream with nodejs and download it with api call in browser

I am stuck on how to write an Excel file to a file stream and download it in the browser. I can only create a new file on the server, but that is not what I want. I don't want to create it on the server (or, if it must be created, I also want to delete it once the user has downloaded it in the browser).
But I can't get the download to work.
The general idea is that I read the CSV file and then parse the data.
I also read a template Excel file, overwrite it, and write it to the file stream. When I call the GET API, the download should start (I will integrate it into an Angular app later).
I am using the Exceljs npm package.
I don't get any errors, but the code does not work the way I want.
I uploaded the whole code to GitHub so you can easily see and reproduce it.
https://github.com/zigax1/mean-generate-download-excel/tree/master
My excel-builder script:
export const generateExcel = async (req: Request, res: Response) => {
  try {
    await csvParse();
    res.setHeader("Content-disposition", `attachment;`);
    res.contentType(
      "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
    );
    return res.status(200).json("Success");
  } catch (err) {
    return res.status(500).json("False");
  }
};
const csvParse = async () => {
  fs.createReadStream("./content/TestCsv.csv")
    .pipe(csv.parse())
    .on("error", (error: any) => console.log("Error"))
    .on("data", (row: any) => {
      let line: any = String(row);
      line = line.split(";");
      //let parsedData = line[0];
      let parsedData = line;
      allParsedData.push(parsedData);
    })
    .on("end", (rowCount: any) => {
      let test = allParsedData.toString();
      generateFile(test);
    });
};
const generateFile = (data: any) => {
  return new Promise<fs.ReadStream>((resolve, reject) => {
    const workbook = new Excel.Workbook();
    workbook.xlsx.readFile("./utilities/template.xlsx").then(() => {
      workbook.xlsx.writeFile("./content/Test.xlsx").then(
        () => {
          let stream = fs.createReadStream("./content/Test.xlsx");
          stream.on("close", () => {
            fs.unlink("./content/Test.xlsx", (error) => {
              if (error) {
                throw error;
              }
            });
          });
          resolve(stream);
        },
        (err) => {
          throw err;
        }
      );
    });
  });
};
Thanks to everyone!
const csv = require('fast-csv');
const fs = require('fs');

function exportCSVFile(res, path, data) {
  const ws = fs.createWriteStream(path + ".csv");
  ws.on("finish", function () {
    res.download(path + ".csv", () => {
      fs.unlinkSync(path + ".csv");
    });
  });
  csv.write(data, { headers: true }).pipe(ws);
}
Use this CSV export function when building your response.
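For example, wired into an Express route (the route path, output path, and rows array below are placeholders, not part of the original code):

const express = require('express');
const app = express();

app.get('/export', (req, res) => {
  // hypothetical data; in the question this would come from the parsed CSV rows
  const rows = [
    { name: 'Alice', total: 10 },
    { name: 'Bob', total: 20 },
  ];
  // writes ./content/export.csv, sends it with res.download, then deletes it
  exportCSVFile(res, './content/export', rows);
});

app.listen(3000);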

how to access a local variable value outside the scope of its function in nodejs

I want to compare the data of two files. To do that, I'm reading a file using the fs module, and since I want to compare the values, I thought I would store them in an external variable. But when I do console.log(budget_details), I get nothing in the console. Please point out if my approach is wrong and whether this is something one shouldn't do in Node.js. I'm new to Node.js.
import csv from 'csv-parser'
import fs from 'fs';

let budget_details

const budgetProcessing = (budget_file_path) => {
  try {
    fs.createReadStream(budget_file_path)
      .pipe(csv())
      .on('data', (row) => {
        budget_details = row
      })
      .on('end', () => {
        console.log('CSV file successfully processed');
      });
  } catch (error) {
    console.log(error)
  }
}

budgetProcessing('budget.csv')
console.log(budget_details)
Let's first explain why you don't get the expected result; it actually has nothing to do with scope:
import csv from 'csv-parser'
import fs from 'fs';

let budget_details

const budgetProcessing = (budget_file_path) => {
  try {
    fs.createReadStream(budget_file_path)
      .pipe(csv())
      .on('data', (row) => {
        budget_details = row
      })
      .on('end', () => {
        console.log('CSV file successfully processed');
      });
  } catch (error) {
    console.log(error)
  }
}

budgetProcessing('budget.csv')
console.log(budget_details)
fs.createReadStream is not itself exactly asynchronous, but we then pipe the returned stream into csv-parser, which does event-based parsing. So even though budgetProcessing is called before console.log(budget_details), the stream reading has most likely not run yet, and budget_details is still undefined.
To fix this, you could move the console.log(budget_details) to where the value is set, like so:
let budget_details

const budgetProcessing = (budget_file_path) => {
  try {
    fs.createReadStream(budget_file_path)
      .pipe(csv())
      .on('data', (row) => {
        budget_details = row
        console.log(budget_details)
      })
      .on('end', () => {
        console.log('CSV file successfully processed');
      });
  } catch (error) {
    console.log(error)
  }
}

budgetProcessing('budget.csv')
but then the variable itself wouldn't serve any real purpose, so instead you could do this:
const budgetProcessing = (budget_file_path, callback) => {
  try {
    fs.createReadStream(budget_file_path)
      .pipe(csv())
      .on('data', (row) => {
        callback(row)
      })
      .on('end', () => {
        console.log('CSV file successfully processed');
      });
  } catch (error) {
    console.log(error)
  }
}

budgetProcessing('budget.csv', (budget_details) => {
  console.log(budget_details) // or anything with budget_details
})
Lastly, I want to make clear that the callback will be called for each row of the CSV, as specified in csv-parser's documentation.
Your calling code doesn't wait for the asynchronous, event-driven work to finish: anything registered with .on(...) runs later, when the event fires. You need something like:
import csv from 'csv-parser'
import fs from 'fs';

let budget_details

const budgetProcessing = (budget_file_path) => new Promise((resolve, reject) => {
  try {
    fs.createReadStream(budget_file_path)
      .pipe(csv())
      .on('data', (row) => {
        budget_details = row
      })
      .on('end', () => {
        console.log('CSV file successfully processed');
        resolve()
      });
  } catch (error) {
    console.log(error)
    reject(error)
  }
})

budgetProcessing('budget.csv')
  .then(() => console.log(budget_details))
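The same promise can also be consumed with async/await; a minimal sketch equivalent to the .then() chain above:

const run = async () => {
  await budgetProcessing('budget.csv');
  console.log(budget_details); // set by now, but only holds the last row
};

run().catch(console.log);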

How to make express download link with GridFsBucket?

As the title says, how do you make a direct download link for a file from MongoDB (GridFSBucket) using Express?
The file should be downloadable from memory, as I don't want to save it temporarily on the server.
I have this method:
async function downloadFileFromDB(fileId) {
  var gridfsbucket = new mongoose.mongo.GridFSBucket(mongoose.connection.db, {
    chunkSizeBytes: 1024,
    bucketName: 'filesBucket'
  });
  try {
    const stream = gridfsbucket.openDownloadStream(fileId)
    const fileBuffer = Buffer.from(stream)
    return fileBuffer
  } catch (err) {
    stream.on('error', () => {
      console.log("Some error occurred in download:" + error);
    })
    console.log(err);
  }
}
And this route:
router.get('/download-file', async (req, res) => {
  const fileId = req.query.fileId
  const ObjectFileId = new ObjectId(fileId)
  const fileBuffer = await fileFacade.downloadFileFromDB(ObjectFileId)
  res.download(fileBuffer)
})
But res.download wants a path, not a buffer. Also, I'm not sure I can make a buffer directly from the openDownloadStream method.
Can anyone help?
I believe you need to write the data to your res object. I accomplished this like so:
const readStream = gridfs.openDownloadStreamByName(filename);

readStream.on("data", (chunk) => {
  res.write(chunk);
});

readStream.on("end", () => {
  res.status(200).end();
  mongoClient.close();
});

readStream.on("error", (err) => {
  console.log(err);
  res.status(500).send(err);
});
So, you may just have to do:
res.write(fileBuffer);
res.end();
//// Instead of doing:
// res.download(fileBuffer);
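Alternatively, you can avoid buffering entirely and pipe the GridFS download stream straight into the response. A minimal sketch of the route, assuming the same bucket setup and imports as in the question:

router.get('/download-file', (req, res) => {
  const fileId = new ObjectId(req.query.fileId);
  const gridfsbucket = new mongoose.mongo.GridFSBucket(mongoose.connection.db, {
    chunkSizeBytes: 1024,
    bucketName: 'filesBucket'
  });

  res.set('Content-Disposition', 'attachment');
  gridfsbucket.openDownloadStream(fileId)
    .on('error', (err) => {
      console.log(err);
      res.status(500).end();
    })
    .pipe(res);
});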

How to create a file stream and write to the stream asynchronously?

I am new to TypeScript/JavaScript and Node.
Now I am trying to create a file stream and write "Hello!" to the stream asynchronously.
#!/usr/bin/env node
import fs from 'fs';

function createStream(filePath: string): Promise<fs.WriteStream> {
  return new Promise<fs.WriteStream>((resolve, reject) => {
    const out = fs.createWriteStream(filePath);
    out.on('close', () => {
      console.log(filePath + ' closed');
      resolve(out);
    });
    out.on('error', (err: any) => {
      console.log(filePath + ' ' + err);
      reject(err);
    });
  });
}

createStream('/tmp/test.txt').then((out: fs.WriteStream) => {
  console.log(out);
  out.write('Hello!');
  out.end();
})
This code does create /tmp/test.txt, but it prints out nothing and the file is empty.
What is the problem with this code?
The promise never settles: it only resolves on 'close', but the stream doesn't close until end() is called, and end() is only called inside .then(), after the promise resolves, so nothing is ever written. You don't need to resolve a promise with the fs.WriteStream, since its creation is synchronous. Just call fs.createWriteStream() directly and pass the instance to a function that creates a promise which settles when the stream closes or errors:
#!/usr/bin/env node
import fs from 'fs';
import stream from 'stream';

function promisify(s: stream.Stream): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    const onClose = () => {
      s.off('error', onError);
      resolve();
    };
    const onError = (error: Error) => {
      s.off('close', onClose);
      reject(error);
    };
    s.once('close', onClose);
    s.once('error', onError);
  });
}

const out = fs.createWriteStream('/tmp/test.txt');

promisify(out).then(() => {
  console.log('Done');
});

out.write('Hello!');
out.end();
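On newer Node versions (roughly 15 and up), the built-in stream/promises module already provides such a helper, so you don't need to hand-roll promisify. A minimal sketch:

import fs from 'fs';
import { finished } from 'stream/promises';

const out = fs.createWriteStream('/tmp/test.txt');
out.write('Hello!');
out.end();

finished(out).then(() => {
  console.log('Done'); // resolves once the stream has finished or closed, rejects on 'error'
});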

Memory leak with csv-parser & promises

I'm having trouble parsing an 800k-line CSV file line by line using the npm library csv-parser and promises.
Here is what I am doing: simply pausing the stream on every row and resuming it after the user has been upserted in the database.
At around 3,000 users, more than 1 GB of RAM is used and a heap out-of-memory exception appears.
const csv = require('csv-parser');
const fs = require('fs');
const path = require('path');

function parseData() {
  return new Promise((resolve, reject) => {
    const stream = fs.createReadStream(filePath)
      .pipe(csv(options))
      .on('data', row => {
        stream.pause();
        upsertUser(row)
          .then(user => {
            stream.resume();
          })
          .catch(err => {
            console.log(err);
            stream.resume();
          });
      })
      .on('error', () => reject())
      .on('end', () => resolve());
    return stream;
  });
}
The upsert function:
function upsertUser(row) {
  return user.find({
    where: {
      mail: row.emailAddress
    }
  });
}
Edit: a screenshot of the Node inspector's memory profile was attached here.
