fast-csv alwaysWriteHeaders: true won't work in Node.js

I am new to Node.js. I have to email the retrieved data in CSV format. My code works fine, but with an empty array the headers are not included in the CSV and I get an empty Excel sheet.
I used
var fastcsv = require("fast-csv");
var format = require('@fast-csv/format');
var csv = fastcsv.write(finalData, {headers: true});
I tried the code below, but it gives me a "format is not a function" error:
var csv = fastcsv.write(finalData, {headers: true});
var csv = csv.format({alwaysWriteHeaders:true})
Please help.

Try using it like this; restructure it:
var fastcsv = require("fast-csv");
var { format } = require('@fast-csv/format');
var csv = fastcsv.write(finalData, {headers: true});
var formatted = format({ alwaysWriteHeaders: true })
console.log(formatted)
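Note that fast-csv can only infer header names from the row objects, so with an empty array there is nothing to infer; as far as I can tell, alwaysWriteHeaders also requires the headers to be supplied explicitly. A minimal sketch along those lines (the column names are placeholders for whatever finalData normally contains):
var fastcsv = require("fast-csv");

// Explicit header list, so headers can be written even when finalData is empty.
var headers = ["id", "name", "email"]; // placeholder column names

var csv = fastcsv.write(finalData, {
    headers: headers,
    alwaysWriteHeaders: true
});

csv.pipe(process.stdout); // or pipe it into whatever stream builds the email attachment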

Related

NodeJS: download CSV with multiple data tables

I'm currently using the objects-to-csv package to download data as CSV, but it seems this package is missing a feature to spread data across different tables instead of putting everything in one.
Here is my code if that helps at all, but my main question is whether there is any other method to do that.
const express = require("express");
const fs = require("fs");
const objectstocsv = require("objects-to-csv");
const app = express();
app.get("/", async (req, res) => {
  const forumData = await getForumData();
  const redditData = await getRedditData();
  const allData = forumData.concat(redditData);
  const csv = new objectstocsv(allData); // <== puts it all into one table
  console.log(csv, "testing result");
  // Save to file:
  await csv.toDisk("./test.csv", { allColumns: true });
  res.download("./test.csv", () => {
    fs.unlinkSync("./test.csv");
  });
});
The typical format for CSV files is one table per file. I'm not sure how you're trying to combine two possibly different record layouts into a single CSV file.
Another option is to output two CSV files:
const ObjectsToCsv = require('objects-to-csv');
const forumData = await getForumData();
const redditData = await getRedditData();
const csvForum = new ObjectsToCsv(forumData);
const csvReddit = new ObjectsToCsv(redditData);
// Save each table to its own file:
await csvForum.toDisk('./forum.csv', { allColumns: true });
await csvReddit.toDisk('./reddit.csv', { allColumns: true });
Then you could combine the CSV files however you want.
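For example, if you do want both tables in the one downloaded file, a rough sketch is to concatenate the two generated files with a blank line between them, so a spreadsheet shows them as two separate blocks (reusing the forum.csv and reddit.csv files from above):
const fs = require('fs');

// Join the two generated CSV files into one, separated by a blank line.
const forumCsv = fs.readFileSync('./forum.csv', 'utf8');
const redditCsv = fs.readFileSync('./reddit.csv', 'utf8');
fs.writeFileSync('./test.csv', forumCsv.trimEnd() + '\n\n' + redditCsv);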

Handling asynchronous streams: read and write multiple CSV files after filtering dates in Node.js

So I have a bunch of CSV files which have more data than I require, and I want to filter them by keeping only the rows with dates after the year 2015. The problem is that it works for a single file, but when I pass in multiple files it writes the same data to all the streams. Can someone help me out...
Here is my code:
const fastcsv = require('fast-csv');
const csv = require('csv-parser');
const fs = require('fs');
const dateList = new Date('2015-01-01')
let files = fs.readdirSync("dir");
for (let file of files) {
    var list = []
    console.log('<inputPath>' + file);
    fs.createReadStream('<inputPath>' + file)
        .pipe(csv())
        .on('data', (row) => {
            // filtering data here
            var dateRow = new Date(row.Date);
            if (dateRow >= dateList) list.push(row);
        })
        .on('end', () => {
            // my write stream; I don't know if I should make some function or do it here itself
            const ws = fs.createWriteStream('<outputPath>' + file)
                .then(() => {
                    console.log(`CSV file successfully processed : ${file}`);
                    fastcsv
                        .write(list, { headers: true })
                        .pipe(ws);
                })
        });
}
I am aware that I should use some setTimeout or callback function, but I don't know where exactly.
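A minimal sketch of one way to fix this: declare the list with let inside the loop, so each file gets its own array instead of one shared var binding, and create the write stream inside the 'end' handler, which only fires once that file has been read completely (the <inputPath>/<outputPath> placeholders are kept from the question):
const fastcsv = require('fast-csv');
const csv = require('csv-parser');
const fs = require('fs');

const cutoff = new Date('2015-01-01');
const files = fs.readdirSync('dir');

for (const file of files) {
    let list = []; // block-scoped, so every file keeps its own rows
    fs.createReadStream('<inputPath>' + file)
        .pipe(csv())
        .on('data', (row) => {
            if (new Date(row.Date) >= cutoff) list.push(row);
        })
        .on('end', () => {
            // the read is finished here, so it is safe to write this file's rows
            fastcsv
                .write(list, { headers: true })
                .pipe(fs.createWriteStream('<outputPath>' + file));
            console.log(`CSV file successfully processed : ${file}`);
        });
}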

Using fs.readFile, I want to get the JSON data into a variable to use in my Node.js project

I am trying to use fs.readFile in my Node.js project. I want to read a file from a different location on the computer, and after reading it I want to store that JSON data in an object and access it in my project. Can anyone please help me with this? I have been stuck on this problem for a long time.
const fs = require('fs');
const path = require('path');
const location = path.join('/users/', 'hello.json');
let rawdata = fs.readFile(location, {encoding: 'utf-8'}, function(err, data){
    let Issuedata = data;
});
You can try this:
const { readFile } = require('fs/promises'); // promise-based fs API
const path = require('path');
const file = path.join("/users/", "hello.json");
const dataObj = {}; // object you intend to use
readFile(file, { encoding: 'utf-8' }).then((result) => {
    if (!dataObj.issueData) {
        dataObj.issueData = JSON.parse(result); // parsed JSON from the file
    }
    console.log(dataObj); // log inside .then: the data is not available before the promise resolves
});
The JSON file is read first, and the result returned by readFile is parsed and stored on the dataObj object. Hope it can help you or steer you in the right direction.
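If you prefer async/await over .then, a minimal equivalent sketch (readJson is just a hypothetical helper name, and the hello.json path comes from the question):
const { readFile } = require('fs/promises');
const path = require('path');

// Hypothetical helper: reads and parses the JSON file, then returns the object.
async function readJson() {
    const file = path.join('/users/', 'hello.json');
    const raw = await readFile(file, { encoding: 'utf-8' });
    return JSON.parse(raw);
}

readJson()
    .then((issueData) => console.log(issueData))
    .catch((err) => console.error(err));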

Output looped Piwik API calls to CSV

I've got the following problem: I am looping through API calls with different dates and appending the output to a CSV file. However, the CSV file only contains data from the first date.
When I log the results to the command prompt I do get multiple dates, meaning the problem occurs when writing the output to the CSV.
Moment.js is used to set the start and end dates to loop through, and fast-csv to write the output of the API calls to a CSV file.
// load and configure
const piwik = require('piwik').setup('placeholderurl', 'XXXXX');
// filesystem requirement
var fs = require('fs');
// fast-csv requirement
var csv = require("fast-csv");
// moment.js requirement
var moment = require('moment');
// variables for looping through the dates
var a = moment().format('2016-05-12');
var b = moment().format('2016-05-15');
var stream = fs.createWriteStream('my.csv', {flags: 'a'})
// compose the API URL
for (var m = moment(a); m.isBefore(b); m.add(1, 'days')) {
    piwik.api(
        {
            method: 'Live.getLastVisitsDetails',
            idSite: 3,
            period: 'day',
            format: 'csv',
            date: moment(m).format('YYYY-MM-DD')
        },
        function (err, data) {
            if (err) {
                console.log(err);
                return;
            }
            console.log(data)
            csv
                .writeToStream(fs.createWriteStream("my.csv"), data, {flags: 'a', headers: true});
        }
    );
}
API token and url removed for privacy reasons.
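For what it's worth, one likely cause is that the { flags: 'a' } option belongs to fs.createWriteStream rather than to fast-csv's writeToStream, so every callback opens my.csv in the default (truncating) write mode. A rough sketch of appending instead, inside the same callback, would be (note the headers would then repeat for every date):
csv.writeToStream(
    fs.createWriteStream("my.csv", { flags: "a" }), // append instead of truncate
    data,
    { headers: true }
);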
Solved. I got rid of the Piwik API package and decided to use a plain HTTP GET to retrieve the URL manually.
The code:
// http requirement
var http = require('http');
var request = require('request');
// filesystem requirement
var fs = require('fs');
// moment.js requirement
var moment = require('moment');
// variables for looping through the dates
var a = moment().format('2016-05-12');
var b = moment().format('2016-05-15');
var m = moment(a);
//var stream = fs.createWriteStream('my.csv', {flags: 'a'})
// compose the API URL
for (var m = moment(a); m.isBefore(b); m.add(1, 'days')) {
    request
        .get("http://placeholder.com/?module=API&method=Live.getLastVisitsDetails&idSite=3&period=day&date=" + moment(m).format('YYYY-MM-DD') + "&format=csv&token_auth=placeholdertoken&filter_limit=-1")
        .on('error', function(err) {
            console.log(err)
        })
        .pipe(fs.createWriteStream('data-' + moment(m).format('YYYY-MM-DD') + '.csv'))
    console.log(moment(m).format('YYYY-MM-DD') + " " + "saved")
}

Convert PDF to doc or docx format using an npm/Node module or via code?

I haven't been successful in finding a Node module or some code that can convert a PDF to doc/docx format using Node.js. Is there any way to do it?
You can use the Aspose.Words Cloud SDK for Node.js (available on npm) to convert PDF to DOCX.
Convert PDF to DOCX from cloud storage:
const { WordsApi, SaveOptionsData } = require("asposewordscloud");
const { UploadFileRequest, SaveAsRequest }= require("asposewordscloud/dist/model/model");
var fs = require('fs');
// Please get your App Key and App SID from https://dashboard.aspose.cloud
wordsApi = new WordsApi("APP_KEY", "APP_SID");
const remotename = "02_pages.pdf";
const remoteTempFolder = "Temp";
const request = new SaveAsRequest({
    saveOptionsData: new SaveOptionsData({
        saveFormat: "docx",
        fileName: "TestPostDocumentSavePdfAsDocx.docx",
    }),
});
request.name = remotename;
request.folder = remoteTempFolder;
wordsApi.saveAs(request).then((result) => {
    console.log(result.body);
}).catch(function(err) {
    // Deal with an error
    console.log(err);
});
Convert PDF to DOCX from a request stream:
const { WordsApi, ConvertDocumentRequest } = require("asposewordscloud");
var fs = require('fs');
// Get Customer ID and Customer Key from https://dashboard.aspose.cloud/
wordsApi = new WordsApi("xxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxx", "xxxxxxxxxxxxxxxxxxxx");
var request = new ConvertDocumentRequest({
    format: "docx",
    document: fs.createReadStream("C:/Temp/02_pages.pdf"),
});
var outputFile = "C:/Temp/ConvertPDFtotxt.docx";
wordsApi.convertDocument(request).then((result) => {
    console.log(result.response.statusCode);
    console.log(result.body.byteLength);
    fs.writeFileSync(outputFile, result.body);
}).catch(function(err) {
    // Deal with an error
    console.log(err);
});
I'm a developer evangelist at Aspose.
node-unoconv
A Node.js wrapper for converting documents with unoconv.
https://github.com/gfloyd/node-unoconv
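Usage is roughly as follows, assuming the package is installed as unoconv and the unoconv/LibreOffice command-line tool is available on the machine (check the repository README for the exact signature):
var unoconv = require('unoconv'); // wrapper around the unoconv CLI; needs LibreOffice installed
var fs = require('fs');

// Convert a PDF to DOCX; the callback receives the converted file as a Buffer.
unoconv.convert('document.pdf', 'docx', function (err, result) {
    if (err) {
        return console.log(err);
    }
    fs.writeFileSync('document.docx', result);
});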
You can try this service: https://docs.groupdocs.cloud/conversion/convert-document/. It can convert between a lot of different formats.
