How to download csv file with koajs - node.js

I'm using koajs as a framework for nodejs. I'm trying to create CSV data and send it to the client as a response, but it isn't working:
let fields = ['code', 'status'];
let p = new Promise((resolve, reject) => {
  json2csv({ data: data, fields: fields }, (err, response) => {
    if (err) {
      reject(err);
    } else {
      resolve(response);
    }
  });
});
return p.then(data => {
  let fileName = 'promotioncode-' + moment().unix();
  ctx.response.attachment(fileName + '.csv');
  ctx.response.type = 'application/ms-excel';
  ctx.body = data;
})
The response is plain-text data instead of a file attachment (screenshots of the response headers and response body omitted).

If you would like to send a downloadable file attached to the body, you need to create a read stream of the file.
const fs = require('fs');
ctx.set('Content-disposition', `attachment; filename=${result}`);
ctx.status = 200;
ctx.body = fs.createReadStream(result);
Note: result holds the file path.

This worked for me:
ctx.set('Content-disposition', `attachment; filename=${fileName}.csv`);
ctx.status = 200;
ctx.body = data;
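For reference, here is a minimal end-to-end sketch of such a route (assuming the callback-style json2csv API used in the question and a Koa handler; the field names and rows are placeholders, and text/csv is used instead of application/ms-excel):
const moment = require('moment');
const json2csv = require('json2csv');

// Hypothetical route handler: build the CSV, then send it as a download.
async function exportPromotionCodes(ctx) {
  const fields = ['code', 'status'];
  const rows = [{ code: 'ABC123', status: 'active' }]; // placeholder data

  const csv = await new Promise((resolve, reject) => {
    json2csv({ data: rows, fields }, (err, output) => (err ? reject(err) : resolve(output)));
  });

  // Set the download headers before assigning the body.
  ctx.attachment(`promotioncode-${moment().unix()}.csv`);
  ctx.type = 'text/csv';
  ctx.body = csv;
}
Keep in mind that Content-Disposition only triggers a save dialog when the browser navigates to the URL (e.g. via a link); an AJAX request will just receive the CSV text in the response body.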

Related

incorrect header check while trying to uncompress s3 object body?

I compress and upload an object to s3 using the following code:
let data: string | Buffer = JSON.stringify(rules);
let contentType = "application/json";
let encoding = null;
let filename = `redirector-rules.json`;
if (format === "gz") {
  contentType = "application/gzip";
  encoding = "gzip";
  filename = `redirector-rules.gz`;
  const buf = Buffer.from(data, "utf-8");
  data = zlib.gzipSync(buf);
}
// res.end(data);
// return res.status(200).send(data);
await s3.upload(filename, data, contentType, encoding);
I am assuming this works correctly, since when I download the resulting file using the aws s3 cp command it works just fine and I am able to uncompress it on my machine. Additionally (possibly unrelated), if I download it via the S3 console, my system is unable to uncompress it, and the file may be corrupt or truncated.
On the other end I have Lambda code that gets the object and attempts to decompress it:
const getRules = async (rulesCommand: GetObjectCommand): Promise<Config> => {
  const resp = await fetchRulesFile(rulesCommand);
  const data = await parseResponse(resp, rulesCommand);
  return data;
};

const fetchRulesFile = async (rulesCommand: GetObjectCommand): Promise<GetObjectCommandOutput> => {
  try {
    console.log(`Retrieving rules file with name ${rulesCommand.input.Key}`);
    const resp = await client.send(rulesCommand);
    return resp;
  } catch (err) {
    throw new Error(`Error retrieving rules file: ${err}`);
  }
};

const parseResponse = async (resp: GetObjectCommandOutput, rulesCommand: GetObjectCommand): Promise<Config> => {
  const { Body } = resp;
  if (!Body) {
    throw new Error("No body in response");
  }
  let data: string = await Body.transformToString();
  if (rulesCommand.input.Key?.endsWith(".gz")) {
    console.log(`Uncompressing rules file with name ${rulesCommand.input.Key}`);
    try {
      data = zlib.gunzipSync(data).toString("utf-8");
    } catch (err) {
      throw new Error(`Error decompressing rules file: ${err}`);
    }
  }
  return JSON.parse(data) as Config;
};
But I keep getting this error: Error: incorrect header check
I resolved the issue by using Readable and streams in the parseResponse function:
const parseResponse = async (
  resp: GetObjectCommandOutput,
  rulesCommand: GetObjectCommand
): Promise<Config> => {
  const { Body } = resp;
  if (!Body) {
    throw new Error("No body in response");
  }
  let data = "";
  const readableStream = new Readable();
  readableStream._read = () => {}; // noop
  // @ts-ignore
  Body.on("data", (chunk: any) => {
    readableStream.push(chunk);
    data += chunk;
  });
  // @ts-ignore
  Body.on("end", () => {
    readableStream.push(null);
  });
  const gunzip = zlib.createGunzip();
  const result = await new Promise((resolve, reject) => {
    let buffer = "";
    readableStream.pipe(gunzip);
    gunzip.on("data", (chunk) => {
      buffer += chunk;
    });
    gunzip.on("end", () => {
      resolve(buffer);
    });
    gunzip.on("error", reject);
  });
  // parse the decompressed JSON before returning it as Config
  return JSON.parse(result as string) as Config;
};
I had to add @ts-ignore at Body.on because of a type mismatch, but it still worked in the compiled JS handler, and fixing it with a proper type conversion seemed a bit complex.
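For what it's worth, the header check fails because transformToString decodes the gzipped bytes as UTF-8 text before gunzip ever sees them. A shorter alternative sketch, assuming a recent @aws-sdk/client-s3 where the response body exposes transformToByteArray, keeps the payload binary:
const zlib = require('zlib');

// Sketch only: read the body as raw bytes, then gunzip if the key ends in .gz.
const parseResponse = async (resp, rulesCommand) => {
  const { Body } = resp;
  if (!Body) throw new Error('No body in response');

  const bytes = await Body.transformToByteArray(); // Uint8Array, no text decoding
  const text = rulesCommand.input.Key?.endsWith('.gz')
    ? zlib.gunzipSync(Buffer.from(bytes)).toString('utf-8')
    : Buffer.from(bytes).toString('utf-8');

  return JSON.parse(text);
};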

How to download file from gitlab synchronously using NodeJS

I need to download a file from a private GitLab server and I need the method to be synchronous. This was my previous async code, and it works fine because I was using promises, but I'm having trouble converting it to synchronous. The other posts I've seen on SO either ended up using async code or didn't have options for headers.
const https = require('https');
const fs = require('fs');

const gitlabUrl = 'https://gitlab.custom.private.com';
const gitlabAccessToken = 'xmyPrivateTokenx';
const gLfilePath = '/api/v4/projects/1234/repository/files/FolderOne%2Ftest.txt/raw?ref=main';
const gLfileName = 'test.txt';

function downloadFileFromGitlab(filePath, fileName) {
  return new Promise((resolve, reject) => {
    var options = {
      path: filePath,
      headers: {
        'PRIVATE-TOKEN': gitlabAccessToken
      }
    };
    var url = gitlabUrl;
    var file = fs.createWriteStream(fileName);
    const request = https.get(url, options, (response) => {
      response.pipe(file);
      file.on('finish', () => {
        file.close();
        resolve();
      });
      file.on('error', (err) => {
        file.close();
        reject(err);
      });
    });
    request.on('error', error => {
      reject(error);
    });
  });
}

downloadFileFromGitlab(gLfilePath, gLfileName);
I was able to figure it out using curl
const child_process = require('child_process');
const fse = require('fs-extra'); // writeFileSync also exists on the built-in fs module

function downloadFileFromGitlab(filePath, fileName) {
  let curlCommand = "curl -s " + gitlabUrl + filePath + " -H 'PRIVATE-TOKEN:" + gitlabAccessToken + "'";
  let file = child_process.execSync(curlCommand);
  fse.writeFileSync(fileName, file);
}
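A slightly more self-contained variant of the same idea (a sketch, not the exact code above): child_process.execFileSync passes arguments directly, so the token never goes through shell quoting, and the built-in fs module is enough for writing the file.
const child_process = require('child_process');
const fs = require('fs');

function downloadFileFromGitlab(filePath, fileName) {
  // execFileSync blocks until curl exits and returns its stdout as a Buffer.
  const body = child_process.execFileSync('curl', [
    '-s',
    '--fail', // exit non-zero on HTTP errors instead of saving an error page
    gitlabUrl + filePath,
    '-H', `PRIVATE-TOKEN: ${gitlabAccessToken}`,
  ]);
  fs.writeFileSync(fileName, body);
}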

node.js cannot parse json - reason: Unexpected token in JSON at position 0

I've been trying for several days, but I can't parse the JSON from this URL: http://sdmx.istat.it/SDMXWS/rest/dataflow/IT1//?detail=full&references=none&format=jsonstructure
I can write it to a file, and if I upload the file to one of the many online tools it tells me that it is valid.
Testing the URL in the same online tools also gives me valid JSON.
This is my code; what am I doing wrong?
var express = require('express');
var router = express.Router();
const fetch = require('node-fetch');
const JSONstat = require("jsonstat-toolkit");
const JSONstatUtils = require("jsonstat-suite");
const fs = require('fs');

router.get('/', async (req, res) => {
  var ws_entry_point = 'http://sdmx.istat.it/SDMXWS/rest';
  var resource = 'dataflow';
  var agencyID = 'IT1';
  var detail = 'full';
  var references = 'none';
  var format = 'jsonstructure';
  var queryUrl = ws_entry_point + '/' + resource + '/' + agencyID + '//?detail=' + detail + '&references=' + references + '&format=' + format;
  //http://sdmx.istat.it/SDMXWS/rest/dataflow/IT1//?detail=full&references=none&format=jsonstructure
  console.log(queryUrl);

  fetch(queryUrl, {
    method: 'GET',
    headers: {
      'Content-Type': 'application/json, charset=UTF-8'
    }
  }).then(checkResponseStatus)
    .then(async response => {
      try {
        const data = await response.json()
        console.log('response data?', data)
      } catch (error) {
        console.log('Error happened here!')
        console.error(error)
      }
    }).catch(err => console.log(err));

  // fs.writeFile('dataflow.json', res.data, function (err) {
  //   if (err) return console.log(err);
  //
  //   console.log('write response.text > dataflow.json');
  //   fs.readFile('dataflow.json', 'utf8', function (err, dataF) {
  //     if (err) throw err; // we'll not consider error handling for now
  //     console.log('read < dataflow.json');
  //     //var obj = JSON.parse(dataF);
  //     console.log(dataF);
  //   });
  // });
});

function checkResponseStatus(res) {
  if (res.ok) {
    return res
  } else {
    throw new Error(`The HTTP status of the response: ${res.status} (${res.statusText})`);
  }
}
Any help is appreciated. Regards,
Maurizio
The server does not respond with the requested application/json content type; instead, it sends a response with content type application/vnd.sdmx.structure+json whose body contains a JSON string with some stray whitespace.
You can fix this by using .text() on the response and manually parsing the content after trimming it:
const rawData = await response.text();
const data = JSON.parse(rawData.trim());
console.log('response data?', data)
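In context, only the body handling in the .then chain from the question needs to change; a sketch of the adjusted call:
fetch(queryUrl, { method: 'GET' })
  .then(checkResponseStatus)
  .then(async response => {
    // The body is JSON, but served as application/vnd.sdmx.structure+json
    // with stray whitespace, so read it as text and parse manually.
    const rawData = await response.text();
    const data = JSON.parse(rawData.trim());
    console.log('response data?', data);
  })
  .catch(err => console.log(err));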

How to concat chunks of incoming binary into a video (webm) file in node js?

I am trying to upload chunks of base64 to a node js server and save those chunks into one file:
let chunks = [];

app.post('/api', (req, res) => {
  let { blob } = req.body;
  //converting chunks of base64 to buffer
  chunks.push(Buffer.from(blob, 'base64'));
  res.json({ gotit: true })
});

app.post('/finish', (req, res) => {
  let buf = Buffer.concat(chunks);
  fs.writeFile('finalvideo.webm', buf, (err) => {
    console.log('Ahh....', err)
  });
  console.log('SAVED')
  res.json({ save: true })
});
The problem with the above code is that the video is not playable, and I don't know why. Am I really doing something wrong? I've also tried writable streams, and that isn't working either.
UPDATE - I
Instead of sending blobs, I've switched to sending binary, but now I'm facing an error: TypeError: First argument must be a string, Buffer, ArrayBuffer, Array, or array-like object.
client.js
postBlob = async blob => {
  let arrayBuffer = await new Response(blob).arrayBuffer();
  let binary = new Uint8Array(arrayBuffer)
  console.log(binary) // logging typed Uint8Array
  axios.post('/api', { binary })
    .then(res => {
      console.log(res)
    })
};
server.js
let chunks = [];

app.post('/api', (req, res) => {
  let { binary } = req.body;
  let chunkBuff = Buffer.from(binary) // This code throwing Error
  chunks.push(chunkBuff);
  console.log(chunkBuff)
  res.json({ gotit: true })
});

//Somehow combine those chunks into one file
app.post('/finish', (req, res) => {
  console.log('Combining the files', chunks.length);
  let buf = Buffer.concat(chunks);
  console.log(buf) //empty buff
  fs.writeFile('save.webm', buf, (err) => {
    console.log('Ahh....', err)
  });
  res.json({ save: true })
});
UPDATE - II
I am able to receive the binary chunks and append them to a stream, but in the final video only the first chunk plays; I don't know what happened to the other chunks, and the video ends.
code
const writeMyStream = fs.createWriteStream(__dirname + '/APPENDED.webm', { flags: 'a', encoding: null });

app.post('/api', (req, res) => {
  let { binary } = req.body;
  let chunkBuff = Buffer.from(new Uint8Array(binary));
  writeMyStream.write(chunkBuff);
  res.json({ gotit: true })
});
UPDATE - III
My client code. Note: I've left the other ways I tried to upload blobs commented out.
customRecordStream = stream => {
  let recorder = new MediaStreamRecorder(stream);
  recorder.mimeType = 'video/webm;codecs=vp9';
  recorder.ondataavailable = this.postBlob
  recorder.start(INT_REC)
};

postBlob = async blob => {
  let arrayBuffer = await new Response(blob).arrayBuffer();
  let binary = new Uint8Array(arrayBuffer)
  axios.post('/api', { binary })
    .then(res => {
      console.log(res)
    })

  // let binaryUi8 = new Uint8Array(arrayBuffer);
  // let binArr = Array.from(binaryUi8);
  // // console.log(new Uint8Array(arrayBuffer))
  //
  // console.log(blob);
  // console.log(binArr)
  // let formData = new FormData();
  // formData.append('fname', 'test.webm')
  // formData.append("file", blob);
  //
  // console.log(formData,'Checjk Me',blob)
  // axios({
  //   method:'post',
  //   url:'/api',
  //   data:formData,
  //   config: { headers: {'Content-Type': 'multipart/form-data' }}
  // }).then(res => {
  //   console.log(res,'FROM SERBER')
  //
  // })
  //
  //
  // .then(res => {
  //   console.log(res)
  // })
  // this.blobToDataURL(blob, (blobURL) => {
  //
  //   axios.post('/api',{blob:blobURL})
  //   .then(res => {
  //     console.log(res)
  //   })
  // })
};
I was able to get this working by converting to base64 encoding on the front end with the FileReader API. On the back end, create a new Buffer from the data chunk sent and write it to a file stream. Some key things with my code sample:
I'm using fetch because I didn't want to pull in axios.
When using fetch, you have to make sure you use bodyParser on the backend.
I'm not sure how much data you're collecting in your chunks (i.e. the duration value passed to the start method on the MediaRecorder object), but you'll want to make sure your backend can handle the size of the data chunk coming in. I set mine really high to 50MB, but this may not be necessary.
I never close the write stream explicitly... you could potentially do this in your /final route. Otherwise, createWriteStream defaults to autoClose, so the node process will do it automatically.
Full working example below:
Front End:
const mediaSource = new MediaSource();
mediaSource.addEventListener('sourceopen', handleSourceOpen, false);
let mediaRecorder;
let sourceBuffer;

function customRecordStream(stream) {
  // should actually check to see if the given mimeType is supported on the browser here.
  let options = { mimeType: 'video/webm;codecs=vp9' };
  recorder = new MediaRecorder(window.stream, options);
  recorder.ondataavailable = postBlob
  recorder.start(INT_REC)
};

function postBlob(event) {
  if (event.data && event.data.size > 0) {
    sendBlobAsBase64(event.data);
  }
}

function handleSourceOpen(event) {
  sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
}

function sendBlobAsBase64(blob) {
  const reader = new FileReader();

  reader.addEventListener('load', () => {
    const dataUrl = reader.result;
    const base64EncodedData = dataUrl.split(',')[1];
    console.log(base64EncodedData)
    sendDataToBackend(base64EncodedData);
  });

  reader.readAsDataURL(blob);
};

function sendDataToBackend(base64EncodedData) {
  const body = JSON.stringify({
    data: base64EncodedData
  });
  fetch('/api', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body
  }).then(res => {
    return res.json()
  }).then(json => console.log(json));
};
Back End:
const fs = require('fs');
const path = require('path');
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
const server = require('http').createServer(app);

app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json({ limit: "50MB", type: 'application/json' }));

app.post('/api', (req, res) => {
  try {
    const { data } = req.body;
    const dataBuffer = Buffer.from(data, 'base64'); // new Buffer() is deprecated
    const fileStream = fs.createWriteStream('finalvideo.webm', { flags: 'a' });
    fileStream.write(dataBuffer);
    console.log(dataBuffer);
    return res.json({ gotit: true });
  } catch (error) {
    console.log(error);
    return res.json({ gotit: false });
  }
});
Inspired by @willascend's answer:
Backend-side:
app.use(express.raw());

app.post('/video-chunck', (req, res) => {
  fs.createWriteStream('myvideo.webm', { flags: 'a' }).write(req.body);
  res.sendStatus(200);
});
Frontend-side:
mediaRecorder.ondataavailable = event => {
  if (event.data && event.data.size > 0) {
    fetch(this.serverUrl + '/video-chunck', {
      method: 'POST',
      headers: { 'Content-Type': 'application/octet-stream' },
      body: event.data
    });
  }
};
My express version is 4.17.1
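One detail worth noting with this approach: express.raw() only parses bodies whose Content-Type is application/octet-stream and caps them at 100kb by default, so larger recorder chunks may need an explicit limit, e.g.:
// Raise the body-size limit if individual chunks can exceed 100kb.
app.use(express.raw({ type: 'application/octet-stream', limit: '10mb' }));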
I faced the same problem today. As a solution, on the back end I used fs.appendFile:
fs.appendFile(Path, rawData, function (err) {
  if (err) throw err;
  console.log('Chunck Saved!');
})
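Wired into the /api route from the earlier answers, that could look roughly like this (a sketch; the base64 payload shape and the output file name are assumptions):
const fs = require('fs');

app.post('/api', (req, res) => {
  const { data } = req.body; // base64-encoded chunk
  const rawData = Buffer.from(data, 'base64'); // decode before appending
  fs.appendFile('finalvideo.webm', rawData, (err) => {
    if (err) {
      console.log(err);
      return res.json({ gotit: false });
    }
    console.log('Chunck Saved!');
    res.json({ gotit: true });
  });
});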

Streaming image data from node server results in corrupted file (gridfs-stream)

I decided to post this after extensive searching here (1, 2, 3) and here (1, 2) and many, many other related posts. I am losing hope, but will not give up that easily :)
I'm using multer to upload a PNG image to a Mongo database:
const storage = new GridFsStorage({
  url: 'mongodb://my_database:thisIsfake#hostName/my_database',
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      crypto.randomBytes(16, (err, buf) => { // generating unique names to avoid duplicates
        if (err) {
          return reject(err);
        }
        const filename = buf.toString('hex') + path.extname(file.originalname);
        const fileInfo = {
          filename: filename,
          bucketName: 'media',
          metadata: {
            clientId: req.body.client_id // added metadata to have a reference to the client to whom the image belongs
          }
        };
        resolve(fileInfo);
      });
    });
  }
});

const upload = multer({ storage }).single('image');
Then I create a stream and pipe it to the response:
loader: function (req, res) {
  var conn = mongoose.createConnection('mongodb://my_database:thisIsfake#hostName/my_database');

  conn.once('open', function () {
    var gfs = Grid(conn.db, mongoose.mongo);
    gfs.collection('media');
    gfs.files.find({ metadata: { clientId: req.body.id } }).toArray(
      (err, files) => {
        if (err) throw err;
        if (files) {
          const readStream = gfs.createReadStream(files[0].filename); //testing only with the first file in the array
          console.log(readStream);
          res.set('Content-Type', files[0].contentType)
          readStream.pipe(res);
        }
      });
  });
}
A Postman POST request to the endpoint results in the response body being displayed as an image file (screenshot omitted).
In the front end I wrap the response in a File object, read it, and save the result in the src attribute of an img:
findAfile() {
  let Data = {
    id: this.$store.state.StorePatient._id,
  };
  console.log(this.$store.state.StorePatient._id);
  visitAxios.post('http://localhost:3000/client/visits/findfile', Data)
    .then(res => {
      const reader = new FileReader();
      let file = new File([res.data], "image.png", { type: "image/png" });
      console.log('this is file: ', file);
      reader.readAsDataURL(file); // encode a string
      reader.onload = function () {
        const img = new Image();
        img.src = reader.result;
        document.getElementById('imgContainer').appendChild(img);
      };
    })
    .catch(err => console.error(err));
}
My File object is similar to the one I get when using a file input, only bigger (screenshots of the original file and the inspected element omitted). The data URI appears where it should be, but it's different from the one produced for the original image from the file input. For comparison, when I display the image through an input element:
onFileSelected(event) {
  this.file = event.target.files[0];
  this.fileName = event.target.files[0].name;
  const reader = new FileReader();
  console.log(this.file);
  reader.onload = function () {
    const img = new Image();
    img.src = reader.result;
    document.getElementById('imageContainer').appendChild(img);
  };
  reader.readAsDataURL(this.file);
}
I get the image correctly (screenshot omitted), but when reading it from the response it is corrupted. Postman gets it right, so there must be something wrong with my front-end code, right? How do I pass this gfs stream to my HTML?
I managed to make a POST request to fetch an image from MongoDB and save it in the server dir:
const readStream = gfs.createReadStream(files[0].filename);
const wstream = fs.createWriteStream(path.join(__dirname,"uploads", "fileToGet.jpg"));
readStream.pipe(wstream);
Then, I just made a simple GET request that serves the file from an absolute path, and finally deletes the file after a successful response:
app.get('/image', function (req, res) {
  var file = path.join(dir, 'fileToGet.jpg');
  if (file.indexOf(dir + path.sep) !== 0) {
    return res.status(403).end('Forbidden');
  }
  var type = mime[path.extname(file).slice(1)] || 'text/plain'; // mime maps extensions to content types
  var s = fs.createReadStream(file);

  s.on('open', function () {
    res.set('Content-Type', type);
    s.pipe(res);
  });
  s.on('end', function () {
    fs.unlink(file, () => {
      console.log("file deleted");
    })
  });
  s.on('error', function () {
    res.set('Content-Type', 'text/plain');
    res.status(404).end('Not found');
  });
});
