csv-parser cannot read or open such .csv file - node.js

My csv file is data.csv, and here is my code in index.js:
const fs = require('fs');
const csv = require('csv-parser');

const inputFile = "./data.csv";
let results = [];

fs.createReadStream(inputFile)
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
  });
Why am I getting an error saying no such file or directory './data.csv'?

When specifying file paths in Node, relative paths are resolved against the working directory from which node itself was executed. This means that if you execute
node ./backEnd/index.js
the actual working directory is the directory above backEnd, not backEnd itself. You can see this via console.log(process.cwd()).
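For example, a minimal sketch of the difference (assuming the layout above, with index.js inside backEnd):
// backEnd/index.js
console.log(process.cwd()); // directory node was launched from, e.g. /home/user/project
console.log(__dirname);     // directory of this file, e.g. /home/user/project/backEnd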
If you would like to read a file relative to the current file that is being executed, you can do:
const fs = require('fs');
const path = require('path');
const csv = require('csv-parser');

const inputFile = path.resolve(__dirname, "./data.csv");
let results = [];

fs.createReadStream(inputFile)
  .pipe(csv())
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(results);
  });
Specifically, __dirname always resolves to the directory of the JavaScript file being executed. Using path.resolve is technically optional here; you could assemble the path manually, but using resolve is the better practice.
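For comparison, a minimal sketch of the manual alternative (equivalent for simple POSIX paths):
const inputFile = __dirname + '/data.csv'; // works, but path.resolve normalizes separators and handles absolute segments for you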

Related

Renaming file not having proper assigned name

I am trying to watch a certain directory, and when a new file is added to it I want to rename the file, but it's not working. The directory-watching part works fine, but when I rename the newly added file, the name I assign gets repeated. For example, if the new name I assign is thisIsName, after the rename it becomes thisIsNamethisIsNamethisIsNamethisIsName. How can I make the rename use the assigned name without any repetition? Any help is appreciated. Thanks in advance.
const fs = require("fs");
const chokidar = require('chokidar');
const watcher = chokidar.watch('filePath', {
ignored: /(^|[\/\\])\../,
persistent: true
});
function yyyymmdd() {
var now = new moment();
return now.format("YYYYMMDD");
}
function hhmmss() {
var now = new moment();
return now.format("HHmmss");
}
const log = console.log.bind(console);
//watching a certain directory for any update to it
watcher
.on('add', path => {
const newFileName = "filePath\\" + yyyymmdd() + hhmmss() + path
//trying to rename the file, but its not working because newFileName is somehow getting looped and having multiple iterations of the DATE and TIME in the new name when getting renamed. Example of what the product looks like is included above in the question.
fs.renameSync(path, newFileName);
})
.on('change', path => {
log(`File ${path} has been changed`)
})
.on('unlink', path => {
log(`File ${path} has been removed`)
})
I've made some small changes in your code and it worked for me for any file format (for unformatted files as well). Anyway, use it as you want. The repetition most likely happens because the renamed file reappears inside the watched folder, so chokidar fires another 'add' event for it and the handler renames it again, prepending the date and time each time. The main thing you missed was the usage of the path module:
const moment = require('moment');
const fs = require('fs');
const chokidar = require('chokidar');
const path = require('path');

const log = console.log.bind(console);

function formattedDate() {
  return moment().format('YYYYMMDDHHmmss');
}

// here I've used a folder named "folder" in the same directory as this file
const filePath = path.join(__dirname, `./folder`);

const watcher = chokidar.watch(filePath, {
  ignored: /(^|[\/\\])\../,
  persistent: true
});

watcher
  .on('add', addedFilePath => {
    const fileExt = path.extname(addedFilePath);
    const newFilePath = path.join(__dirname, `./folder/${formattedDate()}${fileExt}`);
    fs.renameSync(addedFilePath, newFilePath);
  })
  .on('change', changedFilePath => {
    log(`File ${changedFilePath} has been changed`);
  })
  .on('unlink', removingFilePath => {
    log(`File ${removingFilePath} has been removed`);
  });
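One caveat with renaming into the same watched folder: chokidar will typically fire a fresh 'add' event for the renamed file too. A minimal sketch of a guard against re-processing (the 14-digit timestamp check is an assumption for illustration, not part of the original answer):
watcher.on('add', addedFilePath => {
  const fileExt = path.extname(addedFilePath);
  const baseName = path.basename(addedFilePath, fileExt);
  // hypothetical guard: skip files whose name already looks like YYYYMMDDHHmmss
  if (/^\d{14}$/.test(baseName)) return;
  fs.renameSync(addedFilePath, path.join(__dirname, `./folder/${formattedDate()}${fileExt}`));
});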

Handling Asynchronous stream: Read and Write multiple csv files after filtering dates in nodejs

So I have a bunch of csv files which have more data than I require. I want to filter the data by keeping only the rows with dates after the year 2015. The problem is that it works for a single file, but when I enter multiple files it writes the same data into all the streams. So can someone help me out...
Here is my code:
const fastcsv = require('fast-csv');
const csv = require('csv-parser');
const fs = require('fs');

const dateList = new Date('2015-01-01')
let files = fs.readdirSync("dir");

for (let file of files) {
  var list = []
  console.log('<inputPath>' + file);
  fs.createReadStream('<inputPath>' + file)
    .pipe(csv())
    .on('data', (row) => {
      // filtering data here
      var dateRow = new Date(row.Date);
      if (dateRow >= dateList) list.push(row);
    })
    .on('end', () => {
      // my writestream, I don't know if I should make some function or do it here itself
      const ws = fs.createWriteStream('<outputPath>' + file)
        .then(() => {
          console.log(`CSV file successfully processed : ${file}`);
          fastcsv
            .write(list, { headers: true })
            .pipe(ws);
        })
    });
}
I am aware that I should use some setTimeout or some callback function, but I don't know where exactly.
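A minimal sketch of one way to fix this, assuming the rest of the setup stays the same: declare list with const inside the loop so each file accumulates into its own array, and start the write inside that file's own 'end' handler (fs.createWriteStream returns a stream, not a promise, so the .then call has to go):
for (const file of files) {
  const list = []; // block-scoped: each file gets its own array
  fs.createReadStream('<inputPath>' + file)
    .pipe(csv())
    .on('data', (row) => {
      if (new Date(row.Date) >= dateList) list.push(row);
    })
    .on('end', () => {
      console.log(`CSV file successfully processed : ${file}`);
      fastcsv
        .write(list, { headers: true })
        .pipe(fs.createWriteStream('<outputPath>' + file));
    });
}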

How can I create a single read stream from two files in Node?

I'm pretty new to NodeJS streams and fileStreams. I'm trying to parse two XML files using SAX. I've succeeded in getting it to work for a single file:
const fs = require('fs');
const sax = require("sax");
const saxStream = sax.createStream(IS_STRICT, OPTIONS);
saxStream.on("error", function (e) { ... });
...
const out = fs.createReadStream(INFILE).pipe(saxStream);
How can I pipe two files into SAX?
Update
I'm trying to put the output of SAX into a single file. Here's the SAX I'm using, which is an XML parser that uses streams:
https://www.npmjs.com/package/sax
Well, I tried coding a way to do this, but the SAX parser only accepts one root node in the XML input, so once it's done with the first file it ignores all the XML in a second file you feed it, because that XML is outside the root node.
So, if you want to parse the second file, it looks like you need to create a second sax.createStream() and feed it the second file. As always, we could offer a more complete suggestion if you showed us what you're actually trying to do with the parsed XML input.
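A minimal sketch of that two-parser approach (the parseFile helper and the sample file names are assumptions for illustration):
const fs = require('fs');
const sax = require('sax');

// hypothetical helper: parse one file with its own parser instance
function parseFile(file, onTag) {
  return new Promise((resolve, reject) => {
    const saxStream = sax.createStream(false, { trim: true, normalize: true });
    saxStream.on('error', reject);
    saxStream.on('opentag', onTag);
    saxStream.on('end', resolve);
    fs.createReadStream(file).pipe(saxStream);
  });
}

parseFile('./sample1.xml', node => console.log(node))
  .then(() => parseFile('./sample2.xml', node => console.log(node)))
  .then(() => console.log('done with both files'));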
FYI, here's what I tried:
const fs = require('fs');
const sax = require("sax");

const saxStream = sax.createStream(false, {trim: true, normalize: true});
saxStream.on("error", e => {
  console.log("saxStream error", e);
});
saxStream.on("opentag", node => {
  console.log(node);
});
saxStream.on("end", () => {
  console.log("done with saxStream");
})

let stream1 = fs.createReadStream("./sample1.xml");
stream1.pipe(saxStream, {end: false});
stream1.on("end", () => {
  console.log("starting stream2")
  fs.createReadStream("./sample2.xml").pipe(saxStream, {end: true});
});
I stepped through the parser in the debugger, and the second file's input is successfully fed into the SAX parser; it just ignores it because it's outside of the root node.
There are several places in the parser source where it checks for parser.closedRoot and, if so, skips the content.
Actually, I did get it to work by adding a fake root tag that could enclose both sets of XML. I have no idea if this is what you want, but you can examine this for educational purposes:
const fs = require('fs');
const sax = require("sax");

const saxStream = sax.createStream(false, {trim: true, normalize: true});
saxStream.on("error", e => {
  console.log("saxStream error", e);
});
saxStream.on("opentag", node => {
  console.log(node);
});
saxStream.on("end", () => {
  console.log("done with saxStream");
})

let stream1 = fs.createReadStream("./sample1.xml");
let stream2 = fs.createReadStream("./sample2.xml");

saxStream.write("<fakeTop>");
stream1.pipe(saxStream, {end: false});
stream1.on("end", () => {
  console.log("starting stream2")
  stream2.pipe(saxStream, {end: false});
  stream2.on("end", () => {
    saxStream.write("</fakeTop>");
    saxStream.end();
  });
});

Files array return undefined in readdir

I am trying to read all the files inside a nested folder. I am able to reach the innermost layer of the directory tree, but when I tried the readdir method on the folders that contain the files I want to use, readdir returned undefined in the files array.
My directory structure is like this.
The main directory is logs, which contains many directories like logs_of_29 and logs_of_19; each of these contains different folders with names like 157486185, and each of those contains log files. So the path to the file whose content I am trying to read looks like
logs\logs_of_41\1565605874284\file.json
How do I read the data of this nested file?
I have tried following
var http = require('http');
var fs = require('fs');

http.createServer(function (requests, response) {
  handle_files()
}).listen(8080)

handle_files = () => {
  // get the list of directories of logs_of_109
  fs.readdir('logs_of_109', function (error, files) {
    files.map(file => {
      var sub_directory = file
      // get list of all the directories inside logs_of_109 subfolders
      fs.readdir('logs_of_109/' + sub_directory, function (error, files) {
        var sub_directory2 = files
        console.log('logs_of_109/' + sub_directory + sub_directory2)
        files.map(file => {
          fs.readdir('logs_of_109/' + sub_directory + sub_directory2, function (error, files) {
            files.map(file => { console.log(file) })
          })
        })
      })
    })
  })
}
Now the file in the innermost loop gives me undefined. Also, this approach is very repetitive. Is there any better way to do this, and can someone explain why the files array is logging undefined on the console?
I would suggest using a recursive approach; this is normally the easiest way to proceed with this kind of problem. The code below should accomplish what you wish. The server will respond with a list of files in the specified folder:
const fs = require('fs');
const path = require('path');
const http = require('http');
const { promisify } = require('util');

const getStats = promisify(fs.stat);
const readdir = promisify(fs.readdir);

const handle_files = async (req, res) => {
  let files = await scanDir("logs_of_109");
  console.log(`Scan complete, file count: ${files.length}`);
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.write(files.join(", "));
  res.end();
};

http.createServer(handle_files).listen(8080)

async function scanDir(dir, includeDirectories = false, fileList = []) {
  let files = await readdir(dir);
  for (let file of files) {
    let filePath = path.join(dir, file);
    try {
      let stats = await getStats(filePath);
      let isDirectory = stats.isDirectory();
      if (includeDirectories || !isDirectory) {
        fileList.push(filePath);
      }
      if (isDirectory) {
        // recurse with all three arguments so the shared fileList keeps accumulating
        await scanDir(filePath, includeDirectories, fileList);
      }
    } catch (err) {
      // Drop on the floor..
    }
  }
  return fileList;
}
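On newer Node versions (10.10+), the same scan can be written without promisify by using fs.promises together with readdir's withFileTypes option; a minimal sketch:
const fsp = require('fs').promises;
const path = require('path');

async function scanDir(dir, fileList = []) {
  // withFileTypes yields Dirent objects, so no separate stat call is needed per entry
  const entries = await fsp.readdir(dir, { withFileTypes: true });
  for (const entry of entries) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      await scanDir(fullPath, fileList);
    } else {
      fileList.push(fullPath);
    }
  }
  return fileList;
}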

fs-extra copy file outputs blank file

When I run the following code, a blank file gets created with the correct name. I clearly don't want a blank file.
I know the path is correct, because when I make it purposely incorrect it fails (obviously).
const path = require('path');
const fse = require('fs-extra');

const OUTPUT_PATH = 'js/libs/';
const _NODE_MODULES = 'node_modules/';
const filePath = `${_NODE_MODULES}tooltipster/dist/js/tooltipster.bundle.min.js`;

fse.copy(path.join(__dirname, filePath), path.join(__dirname, `${OUTPUT_PATH}/something.js`), err => {
  if (err) {
    console.log(err);
    process.exit(1)
  }
  console.log('Copy complete');
  process.exit(0);
})
The output of this is
Copy complete
but the file is blank, as I previously stated. Any idea what I'm doing wrong here?
I've modified your code and checked it on my PC.
The result: http://joxi.ru/ZrJEEJh1KXw1Aj
Check out this code:
const path = require('path');
const fs = require('fs-extra');

let sourceFile = path.join(__dirname, 'node_modules', 'tooltipster/dist/js/tooltipster.bundle.min.js');
let destinationFile = path.join(__dirname, 'js/libs', 'something.js');

fs.copy(sourceFile, destinationFile, err => {
  if (err) {
    return console.error(err);
  }
  console.log('Copy complete');
});
If it fails again, be sure there is no issue in the code itself.
Check your filesystem: maybe there is an open-file limit, a permission problem, or no free space.
I can also guess that the source file is empty, so do:
cat node_modules/tooltipster/dist/js/tooltipster.bundle.min.js
Your call to process.exit interrupted the copy before it could finish. You don't need to call process.exit; the process will exit on its own when everything is done.
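A minimal corrected sketch along those lines, simply dropping the process.exit calls from the original snippet:
fse.copy(path.join(__dirname, filePath), path.join(__dirname, `${OUTPUT_PATH}/something.js`), err => {
  if (err) {
    console.error(err);
    return;
  }
  console.log('Copy complete'); // the process exits on its own once the copy has flushed
});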
