Download large CSV file using json2csv in Node.js

I am using the json2csv package to download my data. It works fine for small data sets, but once there are more than about 300 records it crashes. Here is my code:
const csvString = json2csv.parse(responseData);
res.setHeader('Content-disposition', 'attachment; filename=shifts-report.csv');
res.set('Content-Type', 'text/csv');
res.status(200).send(csvString);
This code works perfectly fine on small data. How can I stream the data when there is a large amount, using the same approach I followed?
I am trying something like this, but it gives me an error that the headers cannot be set:
const headers = {
  'Content-type': 'text/csv',
  'Transfer-Encoding': 'chunked',
  'Content-Disposition': 'attachment; filename="file.csv"'
};
res.writeHead(200, headers);
res.flushHeaders();
const stream = new Writable({
  write(chunk, encoding, callback) {
    res.write(chunk);
    callback();
  }
});
try {
  stream.write(file, 'utf-8');
} catch (error) {
  console.log('error', error);
}
res.end();

You should use the json2csv-stream package instead:
npm install json2csv-stream
const fs = require('fs');
const MyStream = require('json2csv-stream');
// create the one-time-use transform stream
const parser = new MyStream();
// create the read and write streams
const reader = fs.createReadStream('data.json');
const writer = fs.createWriteStream('out.csv');
//You can use writer to write it to the FS
// or can stream the content to the response object
// however, you need to write this code in any route handler
// which is sending the CSV data back to the user
reader.pipe(parser).pipe(res);
//reader.pipe(parser).pipe(writer);
For more details, check the json2csv-stream package documentation on npm.
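For instance, wired into an Express route handler for the original question, a minimal sketch could look like the following (assuming the JSON records already exist on disk as data.json, as in the snippet above; the route path is made up):
const fs = require('fs');
const MyStream = require('json2csv-stream');
app.get('/shifts-report', (req, res) => {
  res.setHeader('Content-Disposition', 'attachment; filename=shifts-report.csv');
  res.setHeader('Content-Type', 'text/csv');
  const parser = new MyStream(); // one-time-use transform stream
  const reader = fs.createReadStream('data.json'); // assumes the data has been written to disk
  reader.on('error', () => res.end()); // stop the response if the file cannot be read
  reader.pipe(parser).pipe(res); // res is a writable stream, so chunks go out as they are converted
});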

Related

I want to download a zip file in MEAN stack

Here is the frontend code (Angular):
download(user) {
  this.clientAPI.get(Endpoints.DOWNLOAD).toPromise()
    .then(res => {
      let blob = new Blob([new Uint8Array(res.file)], { type: 'application/zip' });
      console.log('blob', blob);
      let a = document.createElement('a');
      a.href = URL.createObjectURL(blob);
      a.download = res.filename;
      document.body.appendChild(a);
      a.click();
      a.remove();
      // this.downloadComplete(user);
    })
    .catch(err => console.error("download error = ", err))
}
Here is my backend code (Node.js):
exports.download = function (req, res) {
  let file = process.env.NOCE_ENV === 'local' ? `${process.cwd()}/server/downloads/eFAST-Release-20.5.17.19.zip` :
    `${process.cwd()}/server/downloads/eFAST-Release-20.5.17.19.zip`;
  let filename = path.basename(file);
  res.setHeader('Content-disposition', 'attachment; filename=' + filename);
  res.setHeader('Content-type', "application/zip");
  let filestream = fs.createReadStream(file);
  // console.log('filestream', filestream)
  res.jsonp({ filename: filename, file: filestream });
};
I am able to download the file, but it is not in zip format; it comes through as a .txt file with zero bytes.
Please have a look and let me know how I can do this.
You can use res.download("link_to_file") in Express to download it.
More info in the Express docs for res.download.
I used res.download, but how can I manage this in Angular?
I'm not sure about Angular, but I think you can use plain JS here to navigate to a new tab where the download will start.
window.open('https://YOUR-LINK-TO-FILE/download','_blank')
By the way, you have a typo: process.env.NOCE_ENV.
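For example, a sketch using the file path from the question (the route path here is made up):
app.get('/api/download', function (req, res) {
  const file = `${process.cwd()}/server/downloads/eFAST-Release-20.5.17.19.zip`;
  res.download(file); // sets Content-Disposition and streams the zip for you
});
On the client, window.open('/api/download', '_blank') (or simply setting location.href) then lets the browser handle the download prompt.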
You need to pipe the filestream (from fs.createReadStream()) into res (the response). If you read the Node.js docs about stream and http, you will notice that res is actually also a Writable stream that you can pipe into, like:
// set status code
res.status(200); // 200 means ok
// set headers beforehand
// octet-stream means binary
res.setHeader('content-type', 'application/octet-stream');
filestream.pipe(res);
Streams are what make Node.js powerful and efficient. They are event-based: a Readable emits chunks of data up to a threshold set by the configurable highWaterMark property (16 KB by default for most streams, 64 KB for fs read streams). You normally pipe a Readable into a Writable stream (in your case a readable file stream into a writable HTTP response stream). You can read more about them here: https://nodejs.org/api/stream.html
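Applied to the backend route in the question, a minimal sketch might look like this (it assumes fs and path are already required in that module, and keeps the question's file path):
exports.download = function (req, res) {
  let file = `${process.cwd()}/server/downloads/eFAST-Release-20.5.17.19.zip`;
  let filename = path.basename(file);
  res.status(200);
  res.setHeader('Content-Disposition', 'attachment; filename=' + filename);
  res.setHeader('Content-Type', 'application/zip');
  let filestream = fs.createReadStream(file);
  filestream.on('error', function (err) {
    console.error(err);
    res.end(); // abort the response instead of sending a JSON body
  });
  filestream.pipe(res); // stream the raw zip bytes instead of res.jsonp()
};
The Angular caller would then need to request the endpoint with a blob/arraybuffer response type (or simply navigate to it) rather than reading res.file from a JSON body.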

Attach two listeners to single axios stream

I am trying to fetch a PDF URL as a stream from axios. I then need to upload that file to another location and return the hash of the uploaded file. I have a third-party function which accepts the stream and uploads the file to the target location. How can I use the same stream to get the hash of the file?
I am trying to run the code below:
const getFileStream = await axios.get<ReadStream>(externalUrl, {
  responseType: "stream"
});
const hashStream = crypto.createHash("md5");
hashStream.setEncoding("hex");
const pHash = new Promise<string>(resolve => {
  getFileStream.data.on("finish", () => {
    resolve(hashStream.read());
  });
});
const pUploadedFile = externalUploader({
  stream: () => getFileStream.data
});
getFileStream.data.pipe(hashStream);
const [hash, uploadedFile] = await Promise.all([pHash, pUploadedFile]);
return { hash, id: uploadedFile.id };
After running this code, when I download the same PDF, I get a corrupted file.
You can reuse the same axios getFileStream.data to pipe to multiple sinks as long as they are consumed simultaneously.
Below is an example of downloading a file using an axios stream and "concurrently" calculating the MD5 checksum of the file while uploading it to a remote server.
The example will output to stdout:
Incoming file checksum: 82c12f208ea18bbeed2d25170f3669a5
File uploaded. Awaiting server response...
File uploaded. Done.
Working example:
const { Writable, Readable, Transform, pipeline } = require('stream');
const crypto = require('crypto');
const https = require('https');
const axios = require('axios');
(async ()=>{
// Create an axios stream to fetch the file
const axiosStream = await axios.get('https://upload.wikimedia.org/wikipedia/commons/thumb/8/86/Map_icon.svg/128px-Map_icon.svg.png', {
responseType: "stream"
});
// To re-upload the file to a remote server, we can use multipart/form-data which will require a boundary key
const key = crypto.randomBytes(16).toString('hex');
// Create a request to stream the file as multipart/form-data to another server
const req = https.request({
hostname: 'postman-echo.com',
path: '/post',
method: 'POST',
headers: {
'content-type': `multipart/form-data; boundary=--${key}`,
'transfer-encoding': 'chunked'
}
});
// Create a promise that will be resolved/rejected when the remote server has completed the HTTP(S) request
const uploadRequestPromise = new Promise((resolve, reject) => req.once('response', (incomingMessage) => {
incomingMessage.resume(); // prevent response data from queuing up in memory
incomingMessage.on('end', () => {
if(incomingMessage.statusCode === 200){
resolve();
}
else {
reject(new Error(`Received status code ${incomingMessage.statusCode}`))
}
});
}));
// Construct the multipart/form-data delimiters
const multipartPrefix = `\r\n----${key}\r\n` +
'Content-Disposition: form-data; filename="cool-name.png"\r\n' +
'Content-Type: image/png\r\n' +
'\r\n';
const multipartSuffix = `\r\n----${key}--`;
// Write the beginning of a multipart/form-data request before streaming the file content
req.write(multipartPrefix);
// Create a promise that will be fulfilled when the file has finished uploading
const uploadStreamFinishedPromise = new Promise((resolve, reject) => {
pipeline(
// Use the axios request as a stream source
axiosStream.data,
// Piggyback a nodejs Transform stream because of the convenient flush() call that can
// add the multipart/form-data suffix
new Transform({
objectMode: false,
transform( chunk, encoding, next ){
next( null, chunk );
},
flush( next ){
this.push( multipartSuffix );
next();
}
}),
// Write the streamed data to a remote server
req,
// This callback is executed when all data from the stream pipe has been processed
(error) => {
if( error ){
reject( error );
}
else {
resolve();
}
}
)
});
// Create a MD5 stream hasher
const hasher = crypto.createHash("md5");
// Create a promise that will be resolved when the hash function has processed all the stream
// data
const hashPromise = new Promise((resolve, reject) => pipeline(
// Use the axios request as a stream source.
// Note that it's OK to use the same stream to pipe into multiple sinks. In this case, we're
// using the same axios stream for both calculating the hash and uploading the file above
axiosStream.data,
// The hash function will process stream data
hasher,
// This callback is executed when all data from the stream pipe has been processed
(error) => {
if( error ){
reject( error );
}
else {
resolve();
}
}
));
/**
* Note that there are no 'awaits' before both stream sinks have been established. That is
* important since we want both sinks to process data from the beginning of stream
*/
// We must wait to call the hash function's digest() until all the data has been processed
await hashPromise;
const hash = hasher.digest("hex");
console.log("Incoming file checksum:", hash);
await uploadStreamFinishedPromise;
console.log("File uploaded. Awaiting server response...");
await uploadRequestPromise;
console.log("File uploaded. Done.");
})()
.catch( console.error );
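Mapped back onto the original snippet, the key details are that a Readable emits 'end' rather than 'finish', and that both consumers must be attached before any await so each sees the data from the first chunk. A hedged, minimal adjustment, keeping the question's externalUploader call as-is:
const hashStream = crypto.createHash("md5");
hashStream.setEncoding("hex");
const pHash = new Promise(resolve => {
  // 'finish' fires on the hash (writable) side once the axios stream has ended
  hashStream.on("finish", () => resolve(hashStream.read()));
});
// Attach both sinks before awaiting anything
getFileStream.data.pipe(hashStream);
const pUploadedFile = externalUploader({ stream: () => getFileStream.data });
const [hash, uploadedFile] = await Promise.all([pHash, pUploadedFile]);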

How to save my cam stream to my server in real time with Node.js?

How can I save my stream chunks, which are converted into blobs, on my Node.js server in real time?
client.js | I am sending my cam stream as binary to my Node.js server
handleBlobs = async (blob) => {
  let arrayBuffer = await new Response(blob).arrayBuffer()
  let binary = new Uint8Array(arrayBuffer)
  this.postBlob(binary)
};
postBlob = blob => {
  axios.post('/api', { blob })
    .then(res => {
      console.log(res)
    })
};
server.js
app.post('/api', (req, res) => {
  console.log(req.body)
});
How can I store the incoming blobs or binary data in one video file once the recording is complete?
This appears to be a duplicate of How to concat chunks of incoming binary into video (webm) file node js?, but it doesn't currently have an accepted answer. I'm copying my answer from that post into this one as well:
I was able to get this working by converting to base64 encoding on the front-end with the FileReader api. On the backend, create a new Buffer from the data chunk sent and write it to a file stream. Some key things with my code sample:
I'm using fetch because I didn't want to pull in axios.
When using fetch, you have to make sure you use bodyParser on the backend
I'm not sure how much data you're collecting in your chunks (i.e. the duration value passed to the start method on the MediaRecorder object), but you'll want to make sure your backend can handle the size of the data chunk coming in. I set mine really high to 50MB, but this may not be necessary.
I never close the write stream explicitly... you could potentially do this in your /final route. Otherwise, createWriteStream defaults to AutoClose, so the node process will do it automatically.
Full working example below:
Front End:
const mediaSource = new MediaSource();
mediaSource.addEventListener('sourceopen', handleSourceOpen, false);
let mediaRecorder;
let sourceBuffer;
function customRecordStream(stream) {
// should actually check to see if the given mimeType is supported on the browser here.
let options = { mimeType: 'video/webm;codecs=vp9' };
mediaRecorder = new MediaRecorder(stream, options);
mediaRecorder.ondataavailable = postBlob;
mediaRecorder.start(INT_REC); // INT_REC: timeslice in ms between dataavailable events
};
function postBlob(event){
if (event.data && event.data.size > 0) {
sendBlobAsBase64(event.data);
}
}
function handleSourceOpen(event) {
sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
}
function sendBlobAsBase64(blob) {
const reader = new FileReader();
reader.addEventListener('load', () => {
const dataUrl = reader.result;
const base64EncodedData = dataUrl.split(',')[1];
console.log(base64EncodedData)
sendDataToBackend(base64EncodedData);
});
reader.readAsDataURL(blob);
};
function sendDataToBackend(base64EncodedData) {
const body = JSON.stringify({
data: base64EncodedData
});
fetch('/api', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body
}).then(res => {
return res.json()
}).then(json => console.log(json));
};
Back End:
const fs = require('fs');
const path = require('path');
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const server = require('http').createServer(app);
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json({ limit: "50MB", type:'application/json'}));
app.post('/api', (req, res) => {
try {
const { data } = req.body;
const dataBuffer = Buffer.from(data, 'base64');
const fileStream = fs.createWriteStream('finalvideo.webm', {flags: 'a'});
fileStream.write(dataBuffer);
console.log(dataBuffer);
return res.json({gotit: true});
} catch (error) {
console.log(error);
return res.json({gotit: false});
}
});
Without attempting to implement this (Sorry no time right now), I would suggest the following:
Read into Node's Stream API; the Express request object is an http.IncomingMessage, which is a Readable stream that can be piped into another stream-based API. https://nodejs.org/api/stream.html#stream_api_for_stream_consumers
Read into Node's Filesystem API, it contains functions such as fs.createWriteStream that can handle the stream of chunks and append into a file, with a path of your choice. https://nodejs.org/api/fs.html#fs_class_fs_writestream
After completing the stream to file, as long as the filename has the correct extension, the file should be playable because the Buffer sent across the browser is just a binary stream. Further reading into Node's Buffer API will be worth your time.
https://nodejs.org/api/buffer.html#buffer_buffer
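Putting those suggestions together, a rough sketch (hedged: the endpoint, the finalvideo.webm filename and the /final route are made up, and it assumes the browser POSTs each binary chunk directly as the request body rather than wrapped in JSON):
const fs = require('fs');
const express = require('express');
const app = express();
// One append-mode write stream for the whole recording
const fileStream = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
app.post('/api', (req, res) => {
  // req is an http.IncomingMessage, i.e. a Readable stream of the raw body
  req.pipe(fileStream, { end: false }); // keep the file open for the next chunk
  req.on('end', () => res.json({ gotIt: true }));
});
app.post('/final', (req, res) => {
  fileStream.end(); // close the file once recording has stopped
  res.json({ done: true });
});
app.listen(3000);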

upload binary file to redmine with node

I am trying to upload a file to Redmine with Node. I can upload and attach text files, but when I try to upload a binary file I get the token, yet the attached file doesn't work. I tried JSON, XML and binary payloads, with ascii and base64 encodings.
I want to upload binary files because I'm doing end-to-end testing and want to open issues with screenshots and upload a report.
I'm using node-rest-client for the service calls.
Could someone give me a suggestion to fix this problem?
Thanks.
I define the RMClient class:
var Client = require('node-rest-client').Client;
var Q = require('q');
var RMClient = function(baseUri, apiToken){
  this._apiToken = apiToken;
  var client = new Client();
  client.registerMethod('openIssue', baseUri+'/issues.json', 'POST');
  client.registerMethod('uploadFile', baseUri+'/uploads.json', 'POST');
  client.registerMethod('getIssues', baseUri+'/issues.json', 'GET');
  this._client = client;
};
Option 1:
var deferred = Q.defer();
var file = fs.readFileSync(filePath);
//code for sending file to redmine uploads.json
return deferred.promise;
Option 2
var deferred = Q.defer();
var rs = fs.createReadStream(filePath, {'flags': 'r', 'encoding': null, 'autoClose': true});
var size = fs.statSync(filePath).size;
var file = '';
rs.on('error', function(err){
  deferred.reject(err);
});
rs.on('data', function(chunk){ file += chunk; });
rs.on('end', function(){
  //code for sending file to redmine uploads.json
});
return deferred.promise;
Code that I use to upload the file:
try{
  if(!file){
    throw new Error('File must not be empty');
  }
  var rmc = new RMClient(myRMURI, myAPItoken);
  var headers = {
    'X-Redmine-API-Key': rmc._apiToken,
    'Content-Type': 'application/octet-stream',
    'Accept': 'application/json',
    'Content-Length': size
  };
  var args = {
    'data': file,
    'headers': headers
  };
  if(parameters){
    args.parameters = parameters;
  }
  rmc._client.methods.uploadFile(args, function(data, response){
    if(response.statusCode != 201){
      var err = new Error(response.statusMessage);
      deferred.reject(err);
      return;
    }
    var attach = JSON.parse(data);
    console.log(attach);
    if(attach.errors){
      var msg = ''.concat.apply('', attach.errors.map(function(item, i){
        return ''.concat(i+1,'- ',item,(i+1<attach.errors.length)?'\n':'');
      }));
      console.error(msg);
      deferred.reject(Error(msg));
    }else{
      deferred.resolve(attach.upload.token);
    }
  });
}catch(err){
  console.error(err);
  deferred.reject(err);
}
I faced the same issue and solved it this way:
Use "multer";
When you have an uploaded file, make a request using the Node "request" module, with req.file.buffer as the body.
When uploading files using the REST API, you have to send the raw file contents in the request body, typically with Content-Type: application/octet-stream. The uploaded file doesn't need any further encoding or wrapping, especially not as multipart/form-data, JSON or XML.
The response of the POST request to /uploads.json (or /uploads.xml) contains the token used to attach the upload to other objects in Redmine.
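A rough sketch of that flow (hedged: the route, the Redmine URL and the token handling are placeholders, and it relies on multer's in-memory storage exposing req.file.buffer). Note also that option 2 in the question corrupts binary files by accumulating chunks into a string (file += chunk); keeping the data in a Buffer avoids that.
const express = require('express');
const multer = require('multer');
const request = require('request');
const app = express();
const upload = multer(); // no options: files are kept in memory as req.file.buffer
app.post('/attach', upload.single('file'), (req, res) => {
  request.post({
    url: 'https://my-redmine.example/uploads.json', // placeholder Redmine URL
    headers: {
      'X-Redmine-API-Key': process.env.REDMINE_API_KEY, // placeholder token
      'Content-Type': 'application/octet-stream' // raw bytes, no multipart/JSON wrapping
    },
    body: req.file.buffer // the untouched binary contents
  }, (err, response, body) => {
    if (err) return res.status(500).send(err.message);
    res.status(response.statusCode).send(body); // on 201 the body contains the upload token
  });
});
app.listen(3000);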

Node - CSV is not working

I am using Node.js v0.10.26 and Express.
I want to implement CSV export functionality in my application, and for this I am using node-csv 0.3.7.
JS
var csv = require('csv');
res.setHeader('Content-disposition', 'attachment; filename=Report.csv');
res.writeHead(200, {'Content-Type': 'text/csv'});
csv().from(callBack).to(res);
But it does not prompt a CSV file download; I am just getting plain text in the service response.
Can anyone tell me what is wrong in my code? I want a CSV file download prompt in this scenario.
Update
Here callBack is a JSON object which contains the data, and res is the response.
Try this:
var csv = require('csv');
res.setHeader('Content-disposition', 'attachment; filename=Report.csv');
res.writeHead(200, {'Content-Type': 'text/csv'});
csv.parse(callBack, function(err, output) {
  if (err) {
    throw err;
  }
  res.write(output);
  res.end();
});
I tested the csv.parse function with this code:
var parse = require('csv-parse');
var data = '"1","2","3","4"\n"a","b","c","d"';
parse(data, function(err, output) {
  if (err) {
    throw err;
  }
  console.log(output);
});
Also, as a side note, I'd recommend not calling a data-containing variable callBack, as convention tends to name callback functions callback. It will work your way, but it will confuse anyone reading your code.
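One more note on direction: csv.parse turns CSV text into arrays, whereas producing CSV from row data is the stringify direction. If the goal is to send an array of records as a CSV download, a hedged sketch with the same csv package (assuming a version that exposes csv.stringify; rows stands for the data previously held in callBack):
var csv = require('csv');
csv.stringify(rows, { header: true }, function (err, output) {
  if (err) { return res.status(500).end(); }
  res.setHeader('Content-disposition', 'attachment; filename=Report.csv');
  res.setHeader('Content-Type', 'text/csv');
  res.end(output); // send the CSV text and finish the response
});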
