Combining multiple strings into a multiline string - node.js

module.exports = {
    name: "help",
    execute(msg, args){
        const fs = require("fs");
        const commandFiles = fs.readdirSync("./commands/").filter(file => file.endsWith(".js"));
        for (const file of commandFiles){
            const name = file.slice(0, -3);
            const descriptionFileName = name.concat(".desc");
            const descriptionFile = `./commands/${descriptionFileName}`;
            var output = "Help:";
            fs.readFile(descriptionFile, function(err, data){
                const helpLine = name.concat(" - ", data.toString());
                output = output + "\n" + helpLine;
            });
            msg.channel.send(output);
        }
    }
}
Expected output:
help - description
ping - description
Output:
Help:
Help:
Any idea why that happens?
I'm new at coding and very new at JS.

You didn't get the expected result because readFile(file, cb) reads a file asynchronously. This means that it just schedules a callback cb to be executed once the I/O operation has completed. However, the following line:
msg.channel.send(output)
is executed synchronously, so output still holds its initial value when it is sent.
One way to handle this could be with promises; here is a partial example based on your code:
module.exports = {
    name: 'help',
    async execute(msg, args) {
        const { readFile, readdir } = require('fs').promises;
        // readdir must be awaited before the result can be filtered
        const commandFiles = (await readdir('./commands/')).filter((file) => file.endsWith('.js'));
        const promises = [];
        for (const file of commandFiles) {
            // use the promise-based readFile with a path relative to ./commands/
            promises.push(readFile(`./commands/${file}`));
        }
        const results = await Promise.all(promises);
        // manipulate results as you want
        msg.channel.send(results);
    },
};
Note that because of the async prefix on the exported execute function, you need to handle a promise in the consumer of this module.
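For example, a minimal sketch of such a consumer (the require path and the msg/args objects here are assumptions, not part of the original code):
const help = require('./commands/help.js'); // hypothetical path

help.execute(msg, args) // returns a promise because execute is async
    .then(() => console.log('help command handled'))
    .catch((err) => console.error(err));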
Another approach could be to use a fully parallel control flow pattern.
Some references:
promises
async/await
control flow

Related

fs.watchFile() a json file until a specific value appears

So I have a JSON file that changes continuously, and I need to read it AFTER a value called auth-token is written to the file. Here is what I have now:
const json = fs.readFileSync("some-json.json")
const headers = JSON.parse(json);
return headers
But it reads the file before anything can be written to it. Is there any way I can use fs.watchFile() and watch the file UNTIL the value is written?
Thanks
You can use fs.watch, although its behavior is a bit unreliable, with multiple events triggered upon a single file change (but I don't think that would be a problem here).
Here is a small sample:
const { watch } = require('fs');
const { readFile } = require('fs/promises');

(async () => {
    const result = await new Promise((resolve) => {
        const watcher = watch('some-json.json', async (eventType, filename) => {
            try {
                const fileContent = await readFile(filename);
                const headers = JSON.parse(fileContent.toString());
                if (headers['auth-token']) { // or whatever test you need here
                    watcher.close();
                    resolve(headers);
                }
            } catch (e) {}
        });
    });
    console.log(result);
})();
Note that if your file gets modified many times before it contains the desired header, it might be preferable to replace fs.watch with a setInterval that reads the file at regular intervals until it contains the value you expect.
Here is what it would look like:
const { readFile } = require('fs/promises');

(async () => {
    const waitingTime = 1000;
    const result = await new Promise((resolve) => {
        const interval = setInterval(async () => {
            try {
                // read inside the try block so a failed read is swallowed too
                const fileContent = await readFile('some-json.json');
                const headers = JSON.parse(fileContent.toString());
                if (headers['auth-token']) { // or whatever test you need here
                    clearInterval(interval);
                    resolve(headers);
                }
            } catch (e) {}
        }, waitingTime);
    });
    console.log(result);
})();
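On newer Node versions (15.9+), fs/promises also exposes watch() as an async iterator, which avoids the manual Promise wrapper. A minimal sketch, assuming the same some-json.json file and auth-token key:
const { watch, readFile } = require('fs/promises');

(async () => {
    // each iteration yields an { eventType, filename } object
    for await (const { filename } of watch('some-json.json')) {
        try {
            const headers = JSON.parse(await readFile(filename, 'utf8'));
            if (headers['auth-token']) { // or whatever test you need here
                console.log(headers);
                break; // leaving the loop closes the watcher
            }
        } catch (e) {} // ignore partial writes / invalid JSON
    }
})();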

Recursively read image files in provided folder

I am given a folder's absolute path, and I want to extract the paths of all text files in the folder (recursively).
This is what I've tried:
const fs = require('fs-extra');
const path = require('path');
const getFolderImagesRecursive = async (folderPath) => {
    const directoryChildren = await fs.readdir(folderPath);
    return directoryChildren.reduce(async (finalArray, directoryChild) => {
        const fullPath = path.join(folderPath, directoryChild);
        const pathStat = await fs.lstat(fullPath);
        if (pathStat.isDirectory()) {
            const recursiveResult = await getFolderImagesRecursive(fullPath);
            return [
                ...finalArray,
                ...recursiveResult,
            ];
        } else if (fullPath.split('.').pop() === 'txt') {
            return [
                ...finalArray,
                fullPath,
            ];
        } else {
            return finalArray;
        }
    }, []);
}
For testing purposes I've created dummy folders with text files or folders nested inside. When I tried the function on the main test folder I got: TypeError: object is not iterable (cannot read property Symbol(Symbol.iterator)) on line 21.
Does anyone see the error and can fix it?
The problem is that Array.prototype.reduce is a synchronous function, and when you pass it an async callback, the accumulator it threads through is a promise, not an array. After the first iteration, finalArray is a promise that resolves to an array, and you're trying to spread that promise on line 21. You can rewrite your code using synchronous loops instead of async/await, like this (an async variant that unwraps the accumulator is sketched after the code):
const fs = require('fs-extra');
const path = require('path');

const getFolderImagesRecursive = (folderPath) => {
    const directoryChildren = fs.readdirSync(folderPath);
    const finalArray = [];
    directoryChildren.forEach(directoryChild => {
        const fullPath = path.join(folderPath, directoryChild);
        const pathStat = fs.lstatSync(fullPath);
        if (pathStat.isDirectory()) {
            const recursiveResult = getFolderImagesRecursive(fullPath);
            // concat returns a new array instead of mutating, so push the results
            finalArray.push(...recursiveResult);
        } else if (fullPath.split('.').pop() === 'txt') {
            finalArray.push(fullPath);
        }
    });
    return finalArray;
}
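If you would rather keep the async version, the alternative fix is to await the accumulator at the start of each reduce step and seed the reduce with an already-resolved promise. A minimal sketch under those assumptions (the renamed helper is mine, untested against your folder layout):
const fs = require('fs-extra');
const path = require('path');

const getFolderTextFilesRecursive = async (folderPath) => {
    const directoryChildren = await fs.readdir(folderPath);
    return directoryChildren.reduce(async (accPromise, directoryChild) => {
        const acc = await accPromise; // unwrap the previous step's promise
        const fullPath = path.join(folderPath, directoryChild);
        const pathStat = await fs.lstat(fullPath);
        if (pathStat.isDirectory()) {
            return [...acc, ...(await getFolderTextFilesRecursive(fullPath))];
        }
        return fullPath.split('.').pop() === 'txt' ? [...acc, fullPath] : acc;
    }, Promise.resolve([]));
};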

Node.js async function: parse CSV and return data to another file

I'm creating a small tool for internal users with Puppeteer.
Basically I've got a CSV file with some data I "read" and fill a form with.
As I try to clean up my project to be reusable, I'm struggling a little bit:
I created a file named parsecsv.js
const config = require('../config.json');
const parse = require('csv-parse');
const fs = require('fs');

const processFile = async () => {
    const records = [];
    const parser = fs
        .createReadStream(config.sourceFile)
        .pipe(parse({
            // CSV options
            from_line: 1,
            delimiter: ";",
        }));
    for await (const record of parser) {
        records.push(record);
    }
    return records;
}

const processFileData = async () => {
    const records = await processFile();
    console.info(records);
    return records;
}

module.exports = {
    processFile, processFileData
}
In another JS file I did:
const parseCSV = require('./src/ParseCsv');
const records = parseCSV.processFileData();
const data = parseCSV.processFile();
console.log(typeof records);
console.table(records);
console.log(typeof data);
console.table(data);
But I never get my data, only an empty object.
How can I get my data so I can "share" it with other functions?
Thanks
As your functions are async and return promises, you can do something like:
const parseCSV = require('./src/ParseCsv');

(async () => {
    const records = await parseCSV.processFileData();
    const data = await parseCSV.processFile();
    console.log(typeof records);
    console.table(records);
    console.log(typeof data);
    console.table(data);
})()
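If the consuming file can be an ES module (Node 14.8+, e.g. with "type": "module" in package.json), top-level await removes the need for the async wrapper. A sketch assuming that setup:
import parseCSV from './src/ParseCsv.js';

// top-level await is allowed in ES modules
const records = await parseCSV.processFileData();
console.table(records);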

node.js synchronous file reading operation problem?

Problem Statement:
Complete the function readFile to read the contents of the file sample.txt
and return the content as a plain text response.
Note:
Make sure that when you read the file, you mention its full path.
E.g., suppose you have to read the file xyz.txt;
then instead of writing './xyz.txt' or 'xyz.txt',
write ${__dirname}/xyz.txt
My Code:
const fs = require('fs');
const path = require('path');

let readFile = () => {
    let file = path.join(__dirname, '/xyz.txt');
    let variableFile = fs.readFileSync(file);
    return variableFile.toString();
};

module.exports = {
    readFile: readFile
};
You have to pass an encoding parameter to readFileSync or it will return a buffer:
const variableFile = fs.readFileSync(file, "utf8");
return variableFile;
PS: You should not use synchronous calls in production. There is a very neat API called promisify that allows you to use async/await or promises with fs:
const { promisify } = require('util');
const fs = require('fs');

const readFile = promisify(fs.readFile);

const example = async () => {
    const file = await readFile(/*...*/);
}
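Note that since Node 10 the fs module ships its own promise-based API, so promisify is not even needed for fs. A minimal equivalent sketch:
const { readFile } = require('fs').promises;

const example = async () => {
    // passing 'utf8' makes readFile resolve with a string instead of a Buffer
    const file = await readFile(`${__dirname}/xyz.txt`, 'utf8');
    return file;
};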

Reading a csv file async - NodeJS

I am trying to create a function where I can pass a file path and read the file in an async way. What I found out was that it supports streams.
const fs = require('fs');
var parse = require('csv-parse');
var async = require('async');

readCSVData = async (filePath): Promise<any> => {
    let csvString = '';
    var parser = parse({delimiter: ','}, function (err, data) {
        async.eachSeries(data, function (line, callback) {
            csvString = csvString + line.join(',') + '\n';
            console.log(csvString); // I can see this value getting populated
        });
    });
    fs.createReadStream(filePath).pipe(parser);
}
I got this code from here, but I am new to Node.js, so I am not getting how to use await to get the data once all lines are parsed.
const csvData = await this.util.readCSVData(path)
My best workaround for this task is:
const csv = require('csvtojson')
const csvFilePath = 'data.csv'
const array = await csv().fromFile(csvFilePath);
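Keep in mind that this await still has to run inside an async function (or an ES module with top-level await). A minimal wrapper, assuming the same data.csv file:
const csv = require('csvtojson');

(async () => {
    const array = await csv().fromFile('data.csv');
    console.log(array);
})();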
This answer provides legacy code that uses the async library. Promise-based control flow with async/await doesn't need that library. Asynchronous processing with async.eachSeries serves no purpose inside the csv-parse callback, because the callback only fires after data has been filled with all collected records.
If reading all data into memory is not an issue, the CSV stream can be converted to a promise:
const fs = require('fs');
const getStream = require('get-stream');
const parse = require('csv-parse');

readCSVData = async (filePath): Promise<any> => {
    const parseStream = parse({delimiter: ','});
    const data = await getStream.array(fs.createReadStream(filePath).pipe(parseStream));
    return data.map(line => line.join(',')).join('\n');
}
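On current Node versions the same result needs no extra dependency, because the csv-parse stream is async-iterable. A sketch, assuming a recent csv-parse that exports { parse }:
const fs = require('fs');
const { parse } = require('csv-parse');

const readCSVData = async (filePath) => {
    const parser = fs.createReadStream(filePath).pipe(parse({ delimiter: ',' }));
    const lines = [];
    for await (const record of parser) {
        lines.push(record.join(',')); // each record is an array of fields
    }
    return lines.join('\n');
};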
