Receive stream of text from Node.js in AngularJS

I have an application built with MongoDB as the database, Node.js on the back end, and AngularJS on the front end.
I need to export a large amount of data from MongoDB to a CSV file and let the user download it. My first implementation retrieved the data from MongoDB using a stream (Mongoose), saved the file using a writable stream, and then returned the path so AngularJS could download it.
This approach fails for bigger files: writing takes longer and the original AngularJS request times out. I changed the back end so that, instead of writing the file to disk, it streams the data to the response, letting AngularJS receive it in chunks and download the file.
Node.js
router.get('/log/export', async (req: bloo.BlooRequest, res: Response) => {
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="download-' + Date.now() + '.csv"');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Pragma', 'no-cache');

  const query = req.query;
  const service = new LogService();
  service
    .exportUsageLog(req.user, query)
    .pipe(res);
});
public exportUsageLog(user: User, query: any) {
  let criteria = this.buildQuery(query);
  let stream = this.dao.getLogs(criteria);
  return stream
    .pipe(treeTransform)
    .pipe(csvTransform);
}

public getLogs(criteria: any): mongoose.QueryStream {
  return this.logs
    .aggregate([
      {
        '$match': criteria
      }, {
        '$lookup': {
          'from': 'users',
          'localField': 'username',
          'foreignField': '_id',
          'as': 'user'
        }
      }
    ])
    .cursor({})
    .exec()
    .stream();
}
This Node.js implementation works: I can see the data arrive when testing with Postman.
My issue is how to receive these chunks in AngularJS, write them to a file, and start the download on the client side, now that the back end streams the data instead of just returning a file path.
My current AngularJS code:
cntrl.export = () => {
  LogService
    .exportLog(cntrl.query)
    .then((res: any) => {
      if (res.data) {
        console.dir(res.data);
        let file = res.data;
        let host = location.protocol + "//" + window.location.host.replace("www.", "");
        let path = file.slice(7, 32);
        let fileName = file.slice(32);
        let URI = host + "/" + path + fileName;
        let link = document.createElement("a");
        link.href = URI;
        link.click();
      }
    })
    .catch((err: any) => console.error(err));
}
angular
  .module('log', [])
  .factory('LogService', LogService);

function LogService(constants, $resource, $http) {
  const ExportLog = $resource(constants.api + 'modules/log/export');
  return {
    exportLog(query: any) {
      return ExportLog.get(query, res => res).$promise;
    }
  };
}
With the above code, "res" is undefined.
How can I achieve this?
Thanks in advance.

Related

Delivering image from S3 to React client via Context API and Express server

I'm trying to download a photo from an AWS S3 bucket via an express server to serve to a react app but I'm not having much luck. Here are my (unsuccessful) attempts so far.
The Workflow is as follows:
Client requests photo after retrieving key from database via Context API
Request sent to express server route (important so as to hide the true location from the client)
Express server route requests blob file from AWS S3 bucket
Express server parses image to base64 and serves to client
Client updates state with new image
React Client
const [profilePic, setProfilePic] = useState('');

useEffect(() => {
  // await is only valid inside an async function, so wrap the call
  (async () => {
    await actions.getMediaSource(tempPhoto.key)
      .then(resp => {
        console.log('server resp: ', resp.data.data.newTest) // returns ����\u0000�\u0000\b\u0006\
        const url = window.URL || window.webkitURL;
        const blobUrl = url.createObjectURL(resp.data.data.newTest);
        console.log("blob ", blobUrl);
        setProfilePic({ ...profilePic, image: resp.data.data.newTest });
      })
      .catch(err => errors.push(err));
  })();
}, []);
Context API - just axios wrapped into its own library
getMediaContents = async ( key ) => {
return await this.API.call(`http://localhost:5000/${MEDIA}/mediaitem/${key}`, "GET", null, true, this.state.accessToken, null);
}
Express server route
router.get("/mediaitem/:key", async (req, res, next) => {
try{
const { key } = req.params;
// Attempt 1 was to try with s3.getObject(downloadParams).createReadStream();
const readStream = getFileStream(key);
readStream.pipe(res);
// Attempt 2 - attempt to convert response to base 64 encoding
var data = await getFileStream(key);
var test = data.Body.toString("utf-8");
var container = '';
if ( data.Body ) {
container = data.Body.toString("utf-8");
} else {
container = undefined;
}
var buffer = (new Buffer.from(container));
var test = buffer.toString("base64");
require('fs').writeFileSync('../uploads', test); // it never wrote to this directory
console.log('conversion: ', test); // prints: 77+977+977+977+9AO+/vQAIBgYH - this doesn't look like base64 to me.
delete buffer;
res.status(201).json({ newTest: test });
} catch (err){
next(ApiError.internal(`Unexpected error > mediaData/:id GET -> Error: ${err.message}`));
return;
}
});
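A likely culprit in attempt 2 is the toString("utf-8") step: decoding binary image bytes as UTF-8 is lossy, because invalid byte sequences are replaced with U+FFFD (bytes EF BF BD), and "77+9" is exactly EF BF BD in base64, which explains the repeated groups in the logged output. A small sketch contrasting the lossless and lossy paths, with a stand-in buffer for data.Body:

```javascript
// Demonstrates why image bytes must not pass through a UTF-8 string.
// `imageBytes` stands in for data.Body from S3 (any binary Buffer).
const imageBytes = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0xff, 0xd8]);

// Lossless: Buffer -> base64 -> Buffer round-trips exactly.
const good = Buffer.from(imageBytes.toString('base64'), 'base64');

// Lossy: Buffer -> utf-8 string -> Buffer replaces each invalid byte
// with U+FFFD (encoded as EF BF BD), corrupting the image.
const bad = Buffer.from(imageBytes.toString('utf-8'), 'utf-8');

console.log(good.equals(imageBytes)); // true
console.log(bad.equals(imageBytes));  // false
```

So encoding with data.Body.toString("base64") directly, and never converting the body to a UTF-8 string first, should produce valid base64 the client can decode.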
AWS S3 Library - I made my own library for using the s3 bucket as I'll need to use more functionality later.
const getFileStream = async (fileKey) => {
  const downloadParams = {
    Key: fileKey,
    Bucket: bucketName
  }

  // This was attempt 1's return, without async in the parameter list:
  return s3.getObject(downloadParams).createReadStream();

  // Attempt 2's intention was just to wait for the promise to be fulfilled:
  // return await s3.getObject(downloadParams).promise();
}

exports.getFileStream = getFileStream;
If you've gotten this far you may have realised that I've tried a couple of things from different sources and documentation but I'm not getting any further. I would really appreciate some pointers and advice on what I'm doing wrong and what I could improve on.
If any further information is needed then just let me know.
Thanks in advance for your time!
Maybe this will be useful for you; here is how I get an image from S3 and process it on the server.
Create a temporary directory:
createTmpDir(): Promise<string> {
  return mkdtemp(path.join(os.tmpdir(), 'tmp-'));
}
Get the file:
readStream(path: string) {
  return this.s3
    .getObject({
      Bucket: this.awsConfig.bucketName,
      Key: path,
    })
    .createReadStream();
}
How I process the file:
async MainMethod(fileName) {
  const dir = await this.createTmpDir();
  const serverPath = path.join(dir, fileName);

  await pipeline(
    this.readStream(fileName), // the S3 key of the attachment
    fs.createWriteStream(serverPath + '.jpg')
  );

  const createdFile = await sharp(serverPath + '.jpg')
    .jpeg()
    .resize({
      width: 640,
      fit: sharp.fit.inside,
    })
    .toFile(serverPath + '.jpeg');

  const imageBuffer = fs.readFileSync(serverPath + '.jpeg');
  // my manipulations
  fs.rmSync(dir, { recursive: true, force: true }); // delete the temporary folder
}

Import large PDF files to be indexed into Elasticsearch

I am trying to import large PDF files into Elasticsearch to index them.
uploadPDFDocument: async (req, res, next) => {
  try {
    let data = req.body;
    let client = await cloudSearchController.getElasticSearchClient();

    const documentData = await fs.readFile("./large.pdf");
    const encodedData = Buffer.from(documentData).toString('base64');

    let document = {
      id: 'my_id_7',
      index: 'my-index-000001',
      pipeline: 'attachment',
      timeout: '5m',
      body: {
        data: encodedData
      }
    }

    let response = await client.create(document);
    console.log(response);
    return res.status(200).send(response);
  } catch (error) {
    console.log(error.stack);
    return next(error);
  }
},
The above code works for small PDF files: I am able to extract data from them and index them.
But for large PDF files I get a timeout exception.
Is there another way to do this without the timeout issue?
I have read about FSCrawler, Filebeat, and Logstash, but they all deal with logs, not PDF files.

Why does my React front end not download the file sent from my Express back end?

Hope you can help me on this one!
Here is the situation: I want to download a file from my React front end by sending a request to a certain endpoint on my Express back end.
Here is my controller for this route.
I build a query, parse the results to generate a CSV file, and send back that file.
When I console.log the response on the front-end side, the data is there; however, no dialog opens allowing the client to download the file to local disk.
module.exports.downloadFile = async (req, res) => {
  const sql = await buildQuery(req.query, 'members', connection)

  // Select the wanted data from the database
  connection.query(sql, (err, results, fields) => {
    if (err) throw err;

    // Convert the JSON into CSV
    try {
      const csv = parse(results);
      // Save the file on the server
      fs.writeFileSync(__dirname + '/export.csv', csv)
      // Reply with the csv file
      res.setHeader('Content-disposition', 'attachment; filename=export.csv');
      res.download(__dirname + '/export.csv');
      // TODO: delete the file
    } catch (err) {
      console.error(err)
    }
  })
}
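The parse(results) call is presumably json2csv's parse; for flat row objects the conversion can also be sketched by hand (header taken from the first row, naive quoting — assumptions for illustration, not the library's exact behavior):

```javascript
// Naive JSON-rows-to-CSV conversion for flat objects.
// The header comes from the first row's keys; values containing
// commas, quotes, or newlines are wrapped in double quotes.
function rowsToCsv(rows) {
  if (rows.length === 0) return '';
  const fields = Object.keys(rows[0]);
  const escape = (v) => {
    const s = v === null || v === undefined ? '' : String(v);
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  const lines = [fields.join(',')];
  for (const row of rows) {
    lines.push(fields.map((f) => escape(row[f])).join(','));
  }
  return lines.join('\n');
}
```

With a string like this in hand, the controller could also skip the temporary file and send the CSV directly with res.send after setting the headers.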
Follow one of these functions as an example for your client-side code.
Promise chaining:
export const download = (url, filename) => {
  fetch(url, {
    mode: 'no-cors' // alternative: mode: 'cors'
  }).then((transfer) => {
    return transfer.blob(); // return the transferred data as a Blob
  }).then((bytes) => {
    let elm = document.createElement('a');  // create a link element in the DOM
    elm.href = URL.createObjectURL(bytes);  // point it at the blob's object URL
    elm.setAttribute('download', filename); // mark it as a download with the given filename
    elm.click();                            // trigger the download
    elm.remove();
  }).catch((error) => {
    console.log(error); // output errors, such as CORS errors when testing non-locally
  })
}
Async/await:
export const download = async (url, filename) => {
  let response = await fetch(url, {
    mode: 'no-cors' // alternative: mode: 'cors'
  });
  try {
    let data = await response.blob();
    let elm = document.createElement('a');  // create a link element in the DOM
    elm.href = URL.createObjectURL(data);   // point it at the blob's object URL
    elm.setAttribute('download', filename); // mark it as a download with the given filename
    elm.click();                            // trigger the download
    elm.remove();
  } catch (err) {
    console.log(err);
  }
}

Returning result of an async operation with a Node.js web server

I'm using Express to build a web API. In the following example, SVG data is converted to PNG and uploaded to S3.
const svg2png = require("svg2png");
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
app.post('/svg_to_png', function (req, res) {
  let params = req.body

  // STEP 1: Convert SVG to PNG:
  var outputBuffer = svg2png.sync(params.svg_data, {});

  // STEP 2: Upload to S3:
  let s3_params = {
    Bucket: params.bucket,
    Key: params.key,
    Body: outputBuffer,
    ContentType: 'image/png',
    ContentDisposition: 'inline',
    ACL: 'public-read'
  }

  s3.putObject(s3_params, function (err, data) {
    if (err) {
      return err;
    }
    return 'success';
  });

  // Return Image URL:
  let image_url = 'https://s3.amazonaws.com/' + params.bucket + '/' + params.key
  res.send(image_url)
})
I want the API to respond with the URL of the converted image, which the requesting client can then immediately download. The problem is, the S3 upload operation is async, and so when the response is delivered, the image does not yet exist at the URL location, forcing the client to poll for its existence.
Is there a way to get the web server to respond only once the S3 upload has completed?
What about something like this:
const putObjPromise = s3.putObject(params).promise();

putObjPromise
  .then(data => {
    // Return the URL here.
  })
  .catch(err => console.log(err))
AWS has this doc for Promises : https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/using-promises.html
Hope this helps.
As @Brandon mentioned, you can return the response once the S3 callback has completed. You can also use s3.putObject(params).promise(); I prefer this since it improves readability.
app.post('/svg_to_png', async function (req, res) {
  let params = req.body
  ...

  // STEP 2: Upload to S3:
  let s3_params = {
    ...
  }

  try {
    const result = await s3.putObject(s3_params).promise();
    // Return Image URL:
    let image_url = 'https://s3.amazonaws.com/' + params.bucket + '/' + params.key
    res.send(image_url)
  } catch (err) {
    // return error response
  }
})

Getting exceljs workbook data created in Node.js to saveAs in the client (SOLVED)

I have an Angular + Node application that can download Excel files rendered using the exceljs package.
All the work (except for fetching the data for the Excel file) is done on the client side. The problem is that the browser couldn't handle such an amount of data.
What I'm trying to do now is basically do all the work on the server; the client should get the data as a buffer array and save it.
This is my original code, which worked (below you can see the fixed version):
Component :
// this.srv.getExcel() just returns an observable of data returned from the DB
this.srv.getExcel().subscribe(result =>
{
  let workbook = new Workbook();
  let worksheet = workbook.addWorksheet('sheet1');
  result.forEach(dataItem => worksheet.addRow(Object.keys(dataItem).map(di => dataItem[di]))); // populating the excel
  workbook.xlsx.writeBuffer().then((data) =>
  {
    const blob: Blob = new Blob([data], {type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;charset=UTF-8'});
    FileSaver.saveAs(blob, 'excelFile.xlsx');
  });
})
Now, trying to convert it (SOLVED):
Component:
this.nodeSrv.getExcel(request, fileName).subscribe(result =>
  {
    const blob: Blob = new Blob([result], {type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet;charset=UTF-8'});
    FileSaver.saveAs(blob, fileName + '.xlsx');
  },
  error => { debugger; this.loading = false; }
)
Service with an HTTP call to the endpoint on the server:
getExcel(request, fileName)
{
  var path = reportsUrl.GetVotingBoxListForExcel;
  const options = { withCredentials: true };
  return this.http.post<any>(path, { request: request, reportName: fileName }, options);
}
This is the main change: most of the work is on the server. This is the nodeSrv router:
const express = require('express');
const router = express.Router();

router.use((req, res, next) =>
{
  // GetDataForExcel is the same as this.srv.getExcel(), only it returns a promise of data from the DB
  return DB.GetDataForExcel(req.body.request).then((dataResult) => {
    let reportTypeNameForExcel = req.body.reportName ? req.body.reportName : '';
    return excel.createExcel(res, dataResult, reportTypeNameForExcel);
  }).catch((err) => {
    next({
      details: err
    })
  });
})

module.exports = router;
This is excel.createExcel (something was probably wrong here originally):
createExcel: function (res, dataResult, reportTypeNameForExcel)
{
  let workbook = new Workbook();
  let worksheet = workbook.addWorksheet('sheet1');
  dataResult.forEach(dataItem => worksheet.addRow(Object.keys(dataItem).map(di => dataItem[di]))); // populating the excel

  res.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  res.setHeader("Content-Disposition", "attachment; filename=Report.xlsx");

  workbook.xlsx.write(res).then(() =>
  {
    res.end();
  })
}
The code above is already fixed - solved
