I'm currently using the objects-to-csv package to export data as CSV, but it seems to be missing a feature to spread data across different tables instead of putting everything in one. Here is my code in case it helps, but my main question is: is there any other method to do that?
const express = require("express");
const fs = require("fs");
const objectstocsv = require("objects-to-csv");

const app = express();

app.get("/", async (req, res) => {
  const forumData = await getForumData();
  const redditData = await getRedditData();
  const allData = forumData.concat(redditData);

  const csv = new objectstocsv(allData); // <== puts it all into one table
  console.log(csv, "testing result");

  // Save to file:
  await csv.toDisk("./test.csv", { allColumns: true });

  res.download("./test.csv", () => {
    fs.unlinkSync("./test.csv");
  });
});
The typical format for CSV files is one table per file. I'm not sure how you're trying to combine two possibly different record layouts into a single CSV file.
Another option is to output two CSV files:
const ObjectsToCsv = require('objects-to-csv');

const forumData = await getForumData();
const redditData = await getRedditData();

const csvForum = new ObjectsToCsv(forumData);
const csvReddit = new ObjectsToCsv(redditData);

// Save to file:
await csvForum.toDisk('./forum.csv', { allColumns: true });
await csvReddit.toDisk('./reddit.csv', { allColumns: true });
Then you could combine the CSV files however you want.
I saw PDFescape mentioned in another post and gave it a shot to edit the form field names. I ran my program and the form fields were indeed filled, but the rest of the content from my template file was missing.
I tried the Fill Form example file and pdf-lib couldn't find the form field names, so I loaded that file in PDFescape as well and saved it from there, with the same result: form fields filled, but the rest of the template missing.
I thought PDFescape might be the issue, so I purchased the Adobe trial to edit the form field names, but there again pdf-lib doesn't find them (Error: PDFDocument has no form field with the name "prodCode") even though they are definitely saved with those names.
Where am I going wrong?! Code for your reference:
const { PDFDocument } = require('pdf-lib');
const fs = require('fs');

(async () => {
  const pdfUTF8 = fs.readFileSync('./test.pdf', 'utf8');
  const formPdfBytes = new TextEncoder().encode(pdfUTF8);

  // Load a PDF with form fields
  const pdfDoc = await PDFDocument.load(formPdfBytes);

  // Get the form containing all the fields
  const form = pdfDoc.getForm();

  // Get fields in the PDF by their names
  const productCodeField = form.getTextField('prodCode');
  const certNumberField = form.getTextField('certNumber');

  productCodeField.setText('Product code here');
  certNumberField.setText('Cert number here');

  // Serialize the PDFDocument to bytes (a Uint8Array)
  const pdfBytes = await pdfDoc.save();
  fs.writeFileSync('./done.pdf', Buffer.from(pdfBytes));
})().catch(e => {
  console.log(e);
});
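For what it's worth, one thing that stands out in the snippet above and could itself explain the missing fields: the PDF is read with the 'utf8' encoding and then re-encoded with TextEncoder, but PDF is a binary format, so that round trip corrupts it. pdf-lib's PDFDocument.load() accepts the raw Buffer returned by fs.readFileSync('./test.pdf') (no encoding argument) directly. A small demonstration of why the text round trip is lossy:

```javascript
// Simulate what the snippet above does to the PDF: decode binary bytes
// as UTF-8 text, then re-encode them. Bytes that aren't valid UTF-8 are
// replaced with U+FFFD during decoding, so the round trip mangles the file.
const original = Buffer.from([0x25, 0x50, 0x44, 0x46, 0xe2, 0xe3, 0xcf, 0xd3]); // "%PDF" plus typical binary marker bytes
const asText = original.toString('utf8');
const reEncoded = Buffer.from(new TextEncoder().encode(asText));

console.log(original.equals(reEncoded)); // false -- the bytes were not preserved
```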
I read the documentation for over a dozen packages and implemented about five. The only one I could get to work with my requirements (Node.js, able to load a remote PDF file from my MongoDB and fill it) was pdfform.js.
const pdfform = require('pdfform.js');
const Mail = require('./tools/sendgrid/sendgrid');
const mongoose = require('mongoose');

mongoose.connect(process.env.MONGODB_URI, { useNewUrlParser: true, useUnifiedTopology: true });

const db = mongoose.connection;
db.on('error', (error) => console.error(error));
db.once('open', () => console.log('Connected to Database'));

db.collection('warrantycerts').findOne({ _id: 4 }, function (err, doc) {
  if (err) {
    console.error(err);
    return;
  }
  const pdf_buf = doc.bin.buffer;

  // List the fillable fields the PDF actually contains
  const pdfFieldsJson = pdfform().list_fields(pdf_buf);
  console.log(pdfFieldsJson);

  const fields = {
    'prodCode': ['prod code here'],
    'certNumber': ['cert here'],
    'model': ['model here'],
    'serial': ['serial here'],
    'date': ['11-10-21'],
  };
  const out_buf = pdfform().transform(pdf_buf, fields);

  Mail.emailAttach(me, 'testing', 'testing', 'cert_test.pdf', Buffer.from(out_buf).toString('base64'));
});
FYI, an issue regarding pdf-lib was submitted but hasn't gotten much traction yet.
So I have a bunch of CSV files which have more data than I require, and I want to filter them by keeping only the rows with dates after the year 2015. The problem is that it works for a single file, but when I enter multiple files it writes the same data in all the streams. Can someone help me out?
Here is my code:
const fastcsv = require('fast-csv');
const csv = require('csv-parser');
const fs = require('fs');

const dateList = new Date('2015-01-01');
let files = fs.readdirSync("dir");

for (let file of files) {
  var list = [];
  console.log('<inputPath>' + file);
  fs.createReadStream('<inputPath>' + file)
    .pipe(csv())
    .on('data', (row) => {
      // filtering data here
      var dateRow = new Date(row.Date);
      if (dateRow >= dateList) list.push(row);
    })
    .on('end', () => {
      // my write stream; I don't know if I should make some function or do it here itself
      const ws = fs.createWriteStream('<outputPath>' + file)
        .then(() => {
          console.log(`CSV file successfully processed : ${file}`);
          fastcsv
            .write(list, { headers: true })
            .pipe(ws);
        });
    });
}
I am aware that I should use some setTimeout or callback function, but I don't know where exactly.
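For reference, the "same data in all streams" symptom usually comes from the `var list` declaration rather than from timing: `var` is function-scoped, so all of the asynchronous 'end' handlers close over the *same* variable, which by the time they fire holds only the last file's rows. A minimal sketch of the scoping difference (file names here are placeholders):

```javascript
// With `var list` every async callback would read one shared variable.
// Declaring the list with const/let inside the loop body gives each
// iteration its own copy, so each callback sees its own file's rows.
const results = [];
for (const file of ['a.csv', 'b.csv']) {
  const list = [file + ':filtered-rows']; // block-scoped: one list per file
  setImmediate(() => results.push(list[0])); // async, like the 'end' handler
}
```

With a block-scoped list, no setTimeout is needed; the remaining fix is to create the write stream and pipe `fastcsv.write(list, ...)` into it directly inside the 'end' handler, since fs.createWriteStream() returns a stream, not a promise, and has no .then().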
So I am trying to write some data from a table to a CSV file. It works most of the time, but sometimes it only writes a few of the rows to the file and exits the test. So I was wondering if there is a way to wait until all the data has been written to the file before proceeding.
Here's my sample code:
async writeRecords(records: T[]): Promise<void> {
  const headerString = !this.append && this.csvStringifier.getHeaderString();
  const recordsString = this.csvStringifier.stringifyRecords(records);
  const writeString = (headerString || '') + recordsString;
  const option = this.getWriteOption();
  await this.write(writeString, option);
  this.append = true;
}

Is there a way to wait until all the data has been written to a CSV file in Selenium (Node.js) before exiting the test?
WriteToCSV = async (filename, data) => {
  if (!filename) {
    await this.logger.Fail('Must provide a filename in order to save a CSV file');
  }
  const reportPath = await GetReportPath(filename, Strings.CSVExt);
  const writer = createArrayCsvWriter({
    path: reportPath,
  });
  await this.logging.Trace(`dataset=${data}`);
  const formattedData = await this.FormatCSVData(data);
  await this.logging.Trace(`formatted=${formattedData}`);
  await writer.writeRecords(formattedData);
};
Correct me if I am wrong, but how about using something like Python's `with open(filename)` (or a Node equivalent that guarantees the file is flushed and closed)?
I've written a script in Node using Puppeteer to fetch different names and the links to their profiles from a webpage. The script fetches them correctly.
What I wish to do now is write the data to a CSV file, but I can't figure out how. I've come across many tutorials describing this, but most of them are either incomplete or use libraries that are no longer maintained.
This is what I've written so far:
const puppeteer = require('puppeteer');

const link = "https://www.ak-brandenburg.de/bauherren/architekten_architektinnen";

(async () => {
  const browser = await puppeteer.launch();
  const [page] = await browser.pages();
  await page.goto(link);

  const listItem = await page.evaluate(() =>
    [...document.querySelectorAll('.views-table tr')].map(item => ({
      name: item.querySelector('.views-field-title a').innerText.trim(),
      profilelink: "https://www.ak-brandenburg.de" + item.querySelector('.views-field-title a').getAttribute("href"),
    }))
  );

  console.log(listItem);
  await browser.close();
})();
How can I write the data to a CSV file?
There is a far easier way to achieve this. If you check out the json2csv library, you can write the data to a CSV file very easily.
Working script:
const fs = require('fs');
const Json2csv = require('json2csv').Parser;
const puppeteer = require('puppeteer');

const link = "https://www.ak-brandenburg.de/bauherren/architekten_architektinnen";

(async () => {
  const browser = await puppeteer.launch();
  const [page] = await browser.pages();
  await page.goto(link);

  const listItem = await page.evaluate(() =>
    [...document.querySelectorAll('.views-table tbody tr')].map(item => ({
      name: item.querySelector('.views-field-title a').innerText.trim(),
      profilelink: "https://www.ak-brandenburg.de" + item.querySelector('.views-field-title a').getAttribute("href"),
    }))
  );

  // json2csv's Parser takes an options object; `fields` picks the columns
  const j2csv = new Json2csv({ fields: ['name', 'profilelink'] });
  const csv = j2csv.parse(listItem);

  fs.writeFileSync('./output.csv', csv, 'utf-8');
  await browser.close();
})();
I haven't worked with Puppeteer, but I have created CSV files in my Node project.
Build your CSV content in a variable, e.g. csvData (fs.writeFile expects a string or Buffer, not an array of objects).
Then use fs.writeFile to save your CSV data.
fs.writeFile(`path/to/csv/${csvName}.csv`, csvData, 'utf8', function (err) {
  if (err) {
    console.log('error', err);
  }
  res.send({
    url: `path/to/csv/${csvName}.csv`
  });
});
Only use res.send if you want to send the CSV file's location from the server to the client.
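One caveat: fs.writeFile needs csvData to already be CSV text, so an array of objects has to be serialized first. A minimal hand-rolled serializer, assuming no values contain commas, quotes, or newlines (use a real CSV library like json2csv for anything more complex):

```javascript
// Turn an array of flat objects into a CSV string with a header row.
// Assumes values contain no commas, quotes, or newlines.
function toCsv(rows) {
  const headers = Object.keys(rows[0]);
  const lines = rows.map(row => headers.map(h => row[h]).join(','));
  return [headers.join(','), ...lines].join('\n') + '\n';
}
```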
I want to export my data to a CSV file, so for that purpose I used fast-csv in Node.js. It's working fine; my code is:
var csv = require("fast-csv");
var fs = require("fs");

app.get('/file/exportToExcel', function (req, res) {
  var whereCondi = {
    'Id': req.query.id,
  };
  var response = {};
  table.find(whereCondi, function (err, tableData) {
    if (err) {
      response.status = 400;
      response.message = err.message;
      res.send(response);
    } else {
      var csvStream = csv.createWriteStream({ headers: true }),
        writableStream = fs.createWriteStream("code.csv");
      writableStream.on("finish", function () {
      });
      csvStream.pipe(writableStream);
      for (var i = 0; i < tableData.length; i++) {
        csvStream.write({ allcodes: tableData[i].code });
      }
      csvStream.end();
    }
  });
});
But the problem is that it saves the CSV file in my root folder; I want the file to download when the user clicks "export to Excel". Please help me.
writableStream = fs.createWriteStream("coupons.csv");

This looks to be your problem, if I'm understanding you correctly. The current code saves the CSV file relative to the app file (essentially the same directory, in your case).

Try something like:

writableStream = fs.createWriteStream("./some/directory/coupons.csv");
You should create the CSV file in your directory and then delete it in the same way, like this:
const express = require('express');
const objectstocsv = require('objects-to-csv');
const fs = require("fs");

const app = express();

var data = [
  { code: 'CA', name: 'California' },
  { code: 'TX', name: 'Texas' },
  { code: 'NY', name: 'New York' },
];

const PORT = process.env.PORT || 5000;

app.get('/', async (req, res) => {
  const csv = new objectstocsv(data);

  // Save to file:
  await csv.toDisk('./test.csv');

  // Download the file
  res.download("./test.csv", () => {
    // Then delete the csv file in the callback
    fs.unlinkSync("./test.csv");
  });
});
Very late to the game, but I wanted to add this in case other people encounter the same hurdle. I'm not sure this is an ideal solution since I just started learning, but I got around this problem by wrapping the CSV creation in an async function and having that function called when a user clicks a button.
Essentially: user clicks button > GET request to a specific path > export CSV and render success page.
index.js or server JS file:

const exportToCsv = () => {
  ...some code to get data and generate csv...
};

app.get('/download', async (req, res) => {
  exportToCsv();
  res.render('<your desired view>');
});
view or HTML:

<button type='submit' onclick='someNavFunction()'>Export</button>

The someNavFunction() can be a helper function that navigates to a new path, or some other solution that helps you hit the '/download' route you created in the server file.
This solution worked for me because I wanted to render a success page after the download. You can add additional validation to only render if the export succeeded, etc.
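For completeness, someNavFunction() can be as simple as pointing the browser at the route; this is a client-side sketch, and '/download' is just the route from the example above:

```javascript
// Client-side helper: navigating to the route triggers the GET handler,
// which runs the CSV export and renders the success view.
function someNavFunction() {
  window.location.href = '/download';
}
```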