I want to export my data into a CSV file, so for that purpose I used fast-csv in Node.js. It's working fine. My code is:
var csv = require("fast-csv");
var fs = require("fs");

app.get('/file/exportToExcel', function(req, res) {
    var whereCondi = {
        'Id': req.query.id,
    };
    var response = {};
    table.find(whereCondi, function(err, tableData) {
        if (err) {
            response.status = 400;
            response.message = err.message;
            res.send(response);
        } else {
            var csvStream = csv.createWriteStream({headers: true}),
                writableStream = fs.createWriteStream("code.csv");
            writableStream.on("finish", function() {
            });
            csvStream.pipe(writableStream);
            for (var i = 0; i < tableData.length; i++) {
                csvStream.write({allcodes: tableData[i].code});
            }
            csvStream.end();
        }
    });
});
But the problem is that it's saving the CSV file in my root folder; I want to download that CSV file when the user clicks on "export to Excel". Please help me.
writableStream = fs.createWriteStream("code.csv");
This looks to be your problem, if I'm understanding you correctly. The current code saves the CSV file relative to the app file (basically in the same directory, in your case).
Try something like:
writableStream = fs.createWriteStream("./some/directory/code.csv");
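Note that this only changes where the file lands on the server. To actually have the browser download the CSV when the user clicks "export to Excel", you can skip the intermediate file and pipe the CSV straight into the response. A minimal sketch, assuming the same fast-csv version as in the question (where csv.createWriteStream is available):

var csv = require("fast-csv");

app.get('/file/exportToExcel', function(req, res) {
    table.find({ 'Id': req.query.id }, function(err, tableData) {
        if (err) {
            return res.send({ status: 400, message: err.message });
        }
        // These headers tell the browser to treat the response as a file download.
        res.setHeader('Content-Type', 'text/csv');
        res.setHeader('Content-Disposition', 'attachment; filename="code.csv"');

        var csvStream = csv.createWriteStream({ headers: true });
        csvStream.pipe(res); // stream rows into the response instead of a file on disk
        for (var i = 0; i < tableData.length; i++) {
            csvStream.write({ allcodes: tableData[i].code });
        }
        csvStream.end();
    });
});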
You can create the CSV file in your directory and then delete it after the download, like this:
const express = require('express')
const objectstocsv = require('objects-to-csv')
const fs = require("fs")

const app = express()

var data = [
  {code: 'CA', name: 'California'},
  {code: 'TX', name: 'Texas'},
  {code: 'NY', name: 'New York'},
];

const PORT = process.env.PORT || 5000

app.get('/', async (req, res) => {
  const csv = new objectstocsv(data);

  // Save to file:
  await csv.toDisk('./test.csv');

  // Download the file
  res.download("./test.csv", () => {
    // Then delete the csv file in the callback
    fs.unlinkSync("./test.csv")
  })
})

app.listen(PORT)
Very late to the game, but wanted to add this in case other people were encountering the same hurdle. Not sure if this is an ideal solution since I just started learning, but I got around this problem by wrapping the csv creation in an async function, and having that function called when a user clicks a button.
Essentially, user clicks button > GET request to specific path > export csv and render success page.
index.js or server.js file
const exportToCsv = async () => {
  ...some code to get data and generate csv...
};

app.get('/download', async (req, res) => {
  await exportToCsv();
  res.render('<your desired view>');
});
view or html
<button type='submit' onclick='someNavFunction()'>Export</button>
The someNavFunction() can be a helper function that navigates to a new path, or some other solution that helps you hit the '/download' route you created in the server file.
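For example, a minimal sketch of such a helper (the function body is an assumption; any navigation that hits the route works):

function someNavFunction() {
  // Navigate the browser to the server route that generates the CSV
  window.location.href = '/download';
}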
This solution worked for me because I wanted to render a success page after the download. You can add additional validation to only render if the export succeeded, etc.
As far as I know (correct me if I'm wrong, please), the flow of downloading a file should be that the frontend makes a call to an API route and everything else happens on the server.
My task was to read from Firestore and write the data to a CSV file. I populated the CSV file with the data, but now when I try to send it to the frontend, the only thing in the file after the download is the first line containing the headers name and email (the file that was written on my computer is correctly filled with the data). This is my route:
import { NextApiHandler } from "next";
import fs from "fs";
import { stringify } from "csv-stringify";
import { firestore } from "../../firestore";
import { unstable_getServerSession } from "next-auth/next";
import { authOptions } from "./auth/[...nextauth]";

const exportFromFirestoreHandler: NextApiHandler = async (req, res) => {
  const session = await unstable_getServerSession(req, res, authOptions);
  if (!session) {
    return res.status(401).json({ message: "You must be authorized here" });
  }

  const filename = "guestlist.csv";
  const writableStream = fs.createWriteStream(filename);
  const columns = ["name", "email"];
  const stringifier = stringify({ header: true, columns });

  const querySnapshot = await firestore.collection("paprockibrzozowski").get();
  await querySnapshot.docs.forEach((entry) => {
    stringifier.write([entry.data().name, entry.data().email], "utf-8");
  });
  stringifier.pipe(writableStream);

  const csvFile = await fs.promises.readFile(
    `${process.cwd()}/${filename}`,
    "utf-8"
  );

  res.status(200).setHeader("Content-Type", "text/csv").send(csvFile);
};

export default exportFromFirestoreHandler;
Since I await the querySnapshot and await readFile, I would expect the entire content of the file to be sent to the frontend. Can you please tell me what I am doing wrong?
Thanks
If anyone else struggles with the same thing, here is the answer, based on @Nelloverflowc's (thank you for getting me this far); however, the files were not always populated with data. At first I tried this:
stringifier.on("close", async () => {
  const csvFile = fs.readFileSync(`${process.cwd()}/${filename}`, "utf-8");
  res
    .status(200)
    .setHeader("Content-Type", "text/csv")
    .setHeader("Content-Disposition", `attachment; filename=${filename}`)
    .send(csvFile);
});
stringifier.end();
The API of https://csv.js.org/ must have changed, because instead of on('finish') it is on('close') now. Reading the file synchronously did the job of always getting the file populated with the correct data, but along with it there was an error:
API resolved without sending a response for /api/export-from-db, this may result in stalled requests.
The solution to that is to convert the file into a readable stream, like so:
try {
  const csvFile = fs.createReadStream(`${process.cwd()}/${filename}`);
  res
    .status(200)
    .setHeader("Content-Type", "text/csv")
    .setHeader("Content-Disposition", `attachment; filename=${filename}`)
    .send(csvFile);
} catch (error) {
  res.status(400).json({ error });
}
Here is the thread and the discussion that helped me:
Node.js send file in response
The await on that forEach is most definitely not doing what you expect it to do; also, you probably shouldn't use await and forEach together.
Either switch to using the Sync API of the csv-stringify library, or do something along these lines (assuming the first .get() actually resolves to the values you expect):
[...]
stringifier.pipe(writableStream);

stringifier.on('finish', async () => {
  const csvFile = await fs.promises.readFile(
    `${process.cwd()}/${filename}`,
    "utf-8"
  );
  res.status(200).setHeader("Content-Type", "text/csv").send(csvFile);
});

for (const entry of querySnapshot.docs) {
  stringifier.write([entry.data().name, entry.data().email], "utf-8");
}
stringifier.end();
[...]
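For the first option, here is a minimal sketch of the same route using the Sync API, assuming csv-stringify v6+, where the synchronous stringify is exposed under csv-stringify/sync:

import { stringify } from "csv-stringify/sync";

const records = querySnapshot.docs.map((entry) => [
  entry.data().name,
  entry.data().email,
]);
// stringify() returns the whole CSV as a string, so no stream events are needed
const csv = stringify(records, { header: true, columns: ["name", "email"] });
res.status(200).setHeader("Content-Type", "text/csv").send(csv);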
I'm currently using the objects-to-csv package to download data as CSV. It seems this package is missing a feature to spread data across different tables instead of putting everything into one.
Here is my code, if that helps at all, but my main question is whether there is any other method to do that.
const express = require("express");
const fs = require("fs");
const objectstocsv = require("objects-to-csv");

const app = express();

app.get("/", async (req, res) => {
  const forumData = await getForumData();
  const redditData = await getRedditData();
  const allData = forumData.concat(redditData);

  const csv = new objectstocsv(allData); // <== puts it all into one table
  console.log(csv, "testing result");

  // Save to file:
  await csv.toDisk("./test.csv", { allColumns: true });

  res.download("./test.csv", () => {
    fs.unlinkSync("./test.csv");
  });
});
The typical format for CSV files is one table per file. I'm not sure how you're trying to combine two possibly different record layouts into a single CSV file.
Another option is to output two CSV files:
const forumData = await getForumData();
const redditData = await getRedditData();

const csvForum = new ObjectsToCsv(forumData);
const csvReddit = new ObjectsToCsv(redditData);

// Save to file:
await csvForum.toDisk('./forum.csv', { allColumns: true });
await csvReddit.toDisk('./reddit.csv', { allColumns: true });
Then you could combine the CSV files however you want.
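For instance, a minimal sketch that stacks the two files into a single download, separated by a blank line (the file names match the snippet above; whether your CSV consumer tolerates two header rows in one file is something to verify):

const fs = require('fs');

// Read both generated files and stack them with a blank separator line
const forumCsv = fs.readFileSync('./forum.csv', 'utf-8');
const redditCsv = fs.readFileSync('./reddit.csv', 'utf-8');
fs.writeFileSync('./combined.csv', forumCsv + '\n' + redditCsv);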
I'm making a basic blog with an admin section to learn the basics of Node and Express. I just implemented multer middleware to save images for a blog post to a folder ("images") on the server - not to Mongo or an S3 bucket - keeping it simple for now while I learn.
I am using EJS and res.render to send and render the frontend. However, I want to put the image in the EJS file as well. I've tried simply passing in the filename, like so:
res.render(path.resolve(__dirname, "../", "views", "posts.ejs"), {postData, file});
postData being the data on the post from the MongoDB collection. All this does is send the filename itself, which is not helpful.
I've looked around but can't seem to find an answer to this. Or am I overthinking it?
Here is the rest of the code for the controller:
const path = require("path");
const fs = require('fs');
const Post = require('../models/modelPosts');

exports.getPost = (req, res, next) => {
  const postPath = req.params.post;
  Post.findOne({ postPath: postPath }, (error, postData) => {
    if (error) { return next(error) };
    if (postData.postReadyToView == true) {
      // find the correct image
      fs.readdirSync('./images').forEach(file => {
        const stringOfFile = JSON.stringify(file);
        const stringOfPathJPEG = JSON.stringify(postPath + ".jpeg");
        const stringOfPathJPG = JSON.stringify(postPath + ".jpg");
        const stringOfPathPNG = JSON.stringify(postPath + ".png");
        // send the ejs file and image
        if (stringOfFile == stringOfPathJPEG ||
            stringOfFile == stringOfPathJPG ||
            stringOfFile == stringOfPathPNG) {
          res.render(path.resolve(__dirname, "../", "views", "posts.ejs"), {
            postData, file
          });
        }
      });
    } else {
      res.redirect(404, "/404");
    }
  });
};
Send the image's file path as data to the page being rendered, register the image folder (e.g. public/images) as a static folder using express.static in Node.js, and then load the image from that path in the rendered page. I think that will work.
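A minimal sketch of that idea (the folder and view names are assumptions based on the question):

const express = require('express');
const path = require('path');
const app = express();

// Serve everything in ./images under the /images URL prefix
app.use('/images', express.static(path.join(__dirname, 'images')));

app.get('/posts/:post', (req, res) => {
  // `file` would come from the matching logic in the question
  const file = req.params.post + '.jpeg';
  res.render('posts', { file });
});

In posts.ejs the image can then be referenced as <img src="/images/<%= file %>">, so only the filename needs to be passed to the view.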
I have a web application that can upload an Excel file. When a user uploads one, the app should parse it and return some of the rows the file contains, so the application doesn't need to save the file to its filesystem. Parsing the file and returning the rows is the whole job. But the code below, which I wrote this morning, saves the file to the server and then parses it. I think it's a waste of server resources.
I don't know how to read an Excel file with createReadStream. How can I parse the Excel file directly, without saving it? I am not familiar with fs; of course, I can delete the file after the job is finished, but is there a more elegant way?
import { createWriteStream } from 'fs'
import path from 'path'
import xlsx from 'node-xlsx'

// some graphql code here...

async singleUpload(_, { file }, context) {
  try {
    console.log(file)
    const { createReadStream, filename, mimetype, encoding } = await file
    await new Promise((res) =>
      createReadStream()
        .pipe(createWriteStream(path.join(__dirname, '../uploads', filename)))
        .on('close', res)
    )
    const workSheetsFromFile = xlsx.parse(path.join(__dirname, '../uploads', filename))
    for (const row of workSheetsFromFile[0].data) {
      console.log(row)
    }
    return { filename }
  } catch (e) {
    throw new Error(e)
  }
},
Using the express-fileupload library, which provides a buffer representation of uploaded files (through the data property), combined with exceljs, which accepts buffers, will get you there.
See express-fileupload and Excel.js.
// read from a file
const workbook = new Excel.Workbook();
await workbook.xlsx.readFile(filename);
// ... use workbook

// read from a stream
const workbook = new Excel.Workbook();
await workbook.xlsx.read(stream);
// ... use workbook

// load from buffer (this is what you're looking for)
const workbook = new Excel.Workbook();
await workbook.xlsx.load(data);
// ... use workbook
Here's a simplified example:
const app = require('express')();
const fileUpload = require('express-fileupload');
const { Workbook } = require('exceljs');

app.use(fileUpload());

app.post('/', async (req, res) => {
  if (!req.files || Object.keys(req.files).length === 0) {
    return res.status(400).send('No files were uploaded.');
  }
  // The name of the input field (i.e. "myFile") is used to retrieve the uploaded file
  const workbook = await new Workbook().xlsx.load(req.files.myFile.data);
  // ... use workbook, e.g. report how many rows the first sheet has
  res.json({ rows: workbook.worksheets[0].rowCount });
});

app.listen(3000)
var xlsx = require('xlsx')

// var workbook = xlsx.readFile('testSingle.xlsx') // reads from a file on disk
var workbook = xlsx.read(fileObj); // reads from data already in memory
You just need to use the xlsx.read method, which reads data already in memory (e.g. an uploaded buffer) instead of a file path.
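For example, a sketch combining this with express-fileupload from the earlier answer (the myFile field name is an assumption):

const app = require('express')();
const fileUpload = require('express-fileupload');
const xlsx = require('xlsx');

app.use(fileUpload());

app.post('/upload', (req, res) => {
  // req.files.myFile.data is a Buffer, so no temp file is ever written
  const workbook = xlsx.read(req.files.myFile.data, { type: 'buffer' });
  const firstSheet = workbook.Sheets[workbook.SheetNames[0]];
  res.json(xlsx.utils.sheet_to_json(firstSheet));
});

app.listen(3000);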
You can add an event listener before you pipe the data, so you can do something with your file before it is uploaded. It looks like this:
async singleUpload(_, { file }, context) {
  try {
    console.log(file)
    const { createReadStream, filename, mimetype, encoding } = await file
    await new Promise((res) =>
      createReadStream()
        .on('data', (data) => {
          // do something with your data/file
          console.log({ data })
          // your code here
        })
        .pipe(createWriteStream(path.join(__dirname, '../uploads', filename)))
        .on('close', res)
    )
    return { filename }
  } catch (e) {
    throw new Error(e)
  }
},
You can see the documentation: Node.js streams
I'm running a Node.js script that will generate several PDF reports.
The thing is, I need to generate several graphs for each PDF, so after several problems I decided to generate the graphs in PNG format, then build the HTML page including the images, and generate a PDF from the HTML.
The thing is, I don't really need routes, but I need EJS, and I need req / res to generate my graphs:
app.get("/operations/:operation/meters/:meter/weekly_report", async (req, res) => { // Used to generate PNG from graph
const meterId = req.params.meter;
const week = req.query.week;
// get meters from meter
const meter = meters.find(it => it.prm === meterId);
const weeklyData = await generateWeeklyGraphForPRM(meter, week);
ejs.renderFile(path.join(__dirname, './views/partials/', "weekly_graph.ejs"), {
days: weeklyData.days,
conso: weeklyData.consoByHour,
meterLabel: meter.label,
}, (err) => {
if (err) {
res.send(err);
} else {
res.render('partials/weekly_graph.ejs', {
days: weeklyData.days,
conso: weeklyData.consoByHour,
meterLabel: meter.label,
});
}
});
And then:
async function makePngScreenshot(url, meterId, filename) {
  axios.get(url, null); // Make the request to generate the html page
  const destination = "public/images/" + operation.data.name + "/" + DATE_INI + "_" + DATE_END + "/" + meterId
  return new Pageres({delay: 2, filename: filename})
    .src(url, ['1300x650'], {crop: true})
    .dest(destination)
    .run()
}
It's working, but right now everything is in index.js.
I am trying to break the code into several files.
As I extract the routes into a routes.js, I have the problem that I can no longer share global vars with all my endpoints.
So here I see 3 solutions:
Use functions instead of endpoints: I don't need endpoints, but I don't know how to render an EJS file without routes and req / res.
In each route, fetch each object again (inefficient).
Use Redis, or any other cache (OK, but I would like to avoid any extra component for now).
The easiest one would be converting the routes into functions, but how can I generate EJS files without routes? Is it possible?
I hope I understand your task correctly. I made an example of a program that is started from the command line, receives the command-line arguments meterId and week, and generates an .html file from an .ejs template. I also used the yargs package to easily parse the command-line arguments.
const path = require('path');
const fs = require('fs');
const argv = require('yargs').argv;
const ejs = require('ejs');

const fsp = fs.promises;

// It would be a good idea to store these parameters in an .env file
const INPUT_FILENAME = 'test.ejs';
const OUTPUT_FILENAME = 'result.html';
const TEMPLATE_FILE = path.resolve(__dirname, './templates', INPUT_FILENAME);
const STORAGE_PATH = path.resolve(__dirname, './storage', OUTPUT_FILENAME);

(async function main({ meterId, week }) {
  if (!meterId) {
    return console.warn('Specify the command line parameter "meterId"!');
  }
  if (!week) {
    return console.warn('Specify the command line parameter "week"!');
  }
  try {
    const html = await ejs.renderFile(TEMPLATE_FILE, { meterId, week }, { async: true });
    await fsp.writeFile(STORAGE_PATH, html);
    console.log('Done.');
  } catch (error) {
    console.error(error);
    process.exit(1);
  }
})(argv);
And an example command to run the script:
node script.js --meterId=141 --week=44
Please let me know if I understood your task correctly and if my example helps somehow.