Array is empty after pushing items (synchronous) - node.js

I am trying to read lines from a .txt file in Node.js. As I get access to each line, I push it to an array, but in the end the array is empty.
var array = [];
let lineReader = require('readline').createInterface({
    input: require('fs').createReadStream('file.txt')
});
lineReader.on('line', function (line) {
    console.log(line);
    array.push(line);
});
console.log(array);
.txt file content
THIS IS LINE#1
THIS IS LINE#2
THIS IS LINE#3
Output
[]
THIS IS LINE#1
THIS IS LINE#2
THIS IS LINE#3

The problem is that the reading happens asynchronously: console.log(array) runs before any 'line' event has fired. You need to add a listener for the close event and log the array there.
lineReader.on("close", () => {
console.log(array);
})
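Putting it together, a minimal sketch of the corrected script (same file.txt as above):
var fs = require('fs');
var readline = require('readline');

var array = [];
var lineReader = readline.createInterface({
    input: fs.createReadStream('file.txt')
});

lineReader.on('line', function (line) {
    array.push(line); // collect each line as it arrives
});

lineReader.on('close', function () {
    // the stream has ended, so every line has been pushed by now
    console.log(array); // [ 'THIS IS LINE#1', 'THIS IS LINE#2', 'THIS IS LINE#3' ]
});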

Related

Asynchronous file read reading different number of lines each time, not halting

I built a simple asynchronous wrapper around the readline module built into Node.js, which is itself event-based. The code is below:
const readline = require('readline');

module.exports = {
    createInterface: args => {
        let self = {
            interface: readline.createInterface(args),
            readLine: () => new Promise((succ, fail) => {
                if (self.interface === null) {
                    succ(null);
                } else {
                    self.interface.once('line', succ);
                }
            }),
            hasLine: () => self.interface !== null
        };
        self.interface.on('close', () => {
            self.interface = null;
        });
        return self;
    }
};
Ideally, I would use it in code like this:
const readline = require("./async-readline");
let filename = "bar.txt";
let linereader = readline.createInterface({
input: fs.createReadStream(filename)
});
let lines = 0;
while (linereader.hasLine()) {
let line = await linereader.readLine();
lines++;
console.log(lines);
}
console.log("Finished");
However, I've observed some erratic and unexpected behavior with this async wrapper. For one, it fails to recognize when the file ends and simply hangs once it reaches the last line, never printing "Finished". On top of that, when the input file is large, say a couple of thousand lines, it's always off by a few lines and doesn't read the full file before halting; in a 2000+ line file it can be off by as many as 20-40 lines. If I put a print statement into the .on('close') listener, I see that it does trigger; however, the program still doesn't recognize that it should no longer have lines to read.
It seems that in Node.js v11.7 the readline interface was given async iterator functionality, so it can simply be looped through with a for await ... of loop:
const rl = readline.createInterface({
    input: fs.createReadStream(filename)
});

for await (const line of rl) {
    console.log(line);
}
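Note that for await needs to run inside an async function (or as top-level await in an ES module on newer Node versions). A rough sketch of the original line-counting loop rewritten with this approach, where countLines is just a hypothetical helper name:
const fs = require('fs');
const readline = require('readline');

async function countLines(filename) {
    const rl = readline.createInterface({
        input: fs.createReadStream(filename)
    });

    let lines = 0;
    for await (const line of rl) {
        lines++; // each iteration waits for the next line, so none are dropped
    }
    console.log(lines);
    console.log("Finished");
}

countLines("bar.txt");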
How to get synchronous readline, or "simulate" it using async, in nodejs?

NodeJs set readline module speed

I'm reading a text file in Node.js using the readline module.
var lineReader = require('readline').createInterface({
    input: require('fs').createReadStream('log.txt')
});
lineReader.on('line', function (line) {
    console.log(line);
});
lineReader.on('close', function () {
    console.log('Finished!');
});
Is there any way to control the pace of the reading?
For example, I want to read one line every 5 ms.
You can pause the reader as soon as you read a line, then resume it 5 ms later, and repeat this until the end of the file. Make sure to set the highWaterMark option to a lower value so that the file read stream doesn't buffer many lines at once.
var lineReader = require('readline').createInterface({
    input: require('fs').createReadStream('./log.txt', {
        highWaterMark: 10
    })
});
lineReader.on('line', line => {
    lineReader.pause(); // pause the reader
    // resume 5 ms later
    setTimeout(() => {
        lineReader.resume();
    }, 5);
    console.log(line);
});
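On Node versions that have the async iterator support mentioned in the previous question, the same pacing can also be written as a loop that waits between iterations. This is only a sketch of that alternative; readSlowly and sleep are hypothetical helpers, and the 5 ms delay comes from the question:
const fs = require('fs');
const readline = require('readline');

const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function readSlowly(filename) {
    const rl = readline.createInterface({
        input: fs.createReadStream(filename)
    });
    for await (const line of rl) {
        console.log(line);
        await sleep(5); // wait 5 ms before handling the next line
    }
    console.log('Finished!');
}

readSlowly('log.txt');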
You can use observables to do this. Here's an example of the kind of buffering I think you want, using click events instead of file line events. I'm not sure whether there's a cleaner way to do it that avoids the setInterval, though.
let i = 0;
const source = Rx.Observable
    .fromEvent(document.querySelector('#container'), 'click')
    .controlled();

var subscription =
    source.subscribe(() => console.log('was clicked ' + i++));

setInterval(() => source.request(1), 500);
Here's a fiddle and also a link to docs for rx:
https://jsfiddle.net/w6ewg175/
https://github.com/Reactive-Extensions/RxJS/blob/master/doc/api/core/operators/controlled.md

Get Data from CSV File in nodejs

I have a CSV file with about 10k records. I need to retrieve them one by one in my Node.js app.
The scenario is: when the user clicks button "X" for the first time, an async request is sent to the Node.js app, which returns the data from the first row of the CSV file. When the user clicks again, it shows the data from the second row, and so on.
I tried using fast-csv and lazy, but they all read the complete file. Is there a way I can achieve this?
Node comes with a readline module in its core, allowing you to process a readable stream line by line.
var fs = require("fs"),
readline = require("readline");
var file = "something.csv";
var rl = readline.createInterface({
input: fs.createReadStream(file),
output: null,
terminal: false
})
rl.on("line", function(line) {
console.log("Got line: " + line);
});
rl.on("close", function() {
console.log("All data processed.");
});
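Since the question asks for one row per request rather than the whole file at once, one way to approximate that with the same module is to pause the interface after each line and resume it only when the next row is requested. This is just a rough sketch of that idea; getNextRow is a hypothetical helper that your request handler would call:
var fs = require("fs"),
    readline = require("readline");

var rl = readline.createInterface({
    input: fs.createReadStream("something.csv"),
    terminal: false
});

var buffered = [];  // rows already read from the file but not yet served
var waiting = null; // callback waiting for the next row, if any

rl.on("line", function (line) {
    rl.pause(); // don't read further ahead than necessary
    if (waiting) {
        var cb = waiting;
        waiting = null;
        cb(line);
    } else {
        buffered.push(line);
    }
});

// hypothetical helper: call this once per button click / request
function getNextRow(callback) {
    if (buffered.length > 0) {
        callback(buffered.shift());
    } else {
        waiting = callback;
        rl.resume();
    }
}

getNextRow(function (row) {
    console.log("First row: " + row);
});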
I think the module 'split' by Dominic Tarr will suffice.
It breaks up the stream line by line.
https://npmjs.org/package/split
var fs = require('fs'),
    split = require('split');

fs.createReadStream(file)
    .pipe(split())
    .on('data', function (line) {
        // each chunk is now a separate line!
    });

How Nodejs insert a string to the n-th line of file easily?

I googled around and checked a few npm modules (e.g. Lazy), but still couldn't find a good pattern for inserting a string at the n-th line of a file.
Being a newbie to Node.js, I suppose this could be done as easily as in other languages, e.g. PHP / Ruby.
Thanks for your solution in advance.
What you can do is:
Open the file in read mode:
var fileData = fs.createReadStream('filename.extension');
Read it line by line and keep a counter.
Compare the counter with your desired n-th line number.
If it matches: write your new line (e.g. output.write("this is a message");) to a second file opened in write/append mode while copying the other lines across.
If it never matches: print "No such position found. Error!"
I'd probably use one of the "given an input stream, notify me on each line" modules, for example node-lazy or byline:
var fs = require('fs'),
    byline = require('byline');

var stream = byline(fs.createReadStream('sample.txt'));

stream.on('line', function (line) {
    // do stuff with line
});

stream.pipe(fs.createWriteStream('./output'));
If your file is small, you can simply read all of the file synchronously and split the result string like this:
var lines = require('fs').readFileSync('abc.txt', 'utf8').split('\n');
lines.forEach(function (line) {
    // do stuff with each line, or index into lines directly, e.g. lines[1]
});
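Since the question is about inserting a string at the n-th line, here is a minimal sketch of that idea built on the same synchronous read; the filename, the index n, and the inserted text are placeholders:
var fs = require('fs');

var n = 2; // insert before the 3rd line (0-based index)
var lines = fs.readFileSync('abc.txt', 'utf8').split('\n');

lines.splice(n, 0, 'this is the inserted line'); // insert without removing anything

fs.writeFileSync('abc.txt', lines.join('\n'));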
Another way:
Line-by-line npm
var LineByLineReader = require('line-by-line'),
    lr = new LineByLineReader('big_file.txt');

lr.on('error', function (err) {
    // 'err' contains the error object
});

lr.on('line', function (line) {
    // pause emitting of lines...
    lr.pause();
    // ...do your asynchronous line processing...
    setTimeout(function () {
        // ...and continue emitting lines.
        lr.resume();
    }, 100);
});

lr.on('end', function () {
    // All lines are read, file is closed now.
});
Your node-lazy way:
var lazy = require("lazy"),
fs = require("fs");
var matched_line_number = 10;// let say 10, can be any
new lazy(fs.createReadStream('./MyVeryBigFile.extension'))
.lines
.forEach(function(line){
console.log(line.toString());
ctr++;
}
);
Another way could be:
var fs = require('fs'),
    async = require('async'),
    carrier = require('carrier');

async.parallel({
    input: function (cb) { cb(null, fs.createReadStream('./input.txt')); },
    output: function (cb) { cb(null, fs.createWriteStream('./output.txt', { flags: 'a' })); }
}, function (err, result) {
    if (err) {
        console.log("An error occurred: " + err);
        return;
    }
    carrier.carry(result.input)
        .on('line', function (line) {
            result.output.write(line + '\n');
        })
        .on('end', function () {
            result.output.end();
            console.log("Done");
        });
});
Open your file in read mode, check line by line for the desired line, and simultaneously write to another file, manipulating your lines as you go.
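A minimal sketch of that read-and-rewrite approach using the core readline module; the file names, the target line number, and the inserted text are placeholders:
var fs = require('fs'),
    readline = require('readline');

var target = 5; // insert before the 5th line (1-based)
var lineNo = 0;

var rl = readline.createInterface({
    input: fs.createReadStream('./input.txt')
});
var out = fs.createWriteStream('./output.txt');

rl.on('line', function (line) {
    lineNo++;
    if (lineNo === target) {
        out.write('this is the inserted line\n'); // write the new line first
    }
    out.write(line + '\n'); // then copy the original line across
});

rl.on('close', function () {
    out.end();
    console.log('Done');
});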

Reading only modified data

In my project, I am using fs.watchFile to listen for modifications to a text file.
Requirement
Read only the newly added data.
Note: data is only ever added to the text file, never deleted.
Sample code
fs.watchFile(config.filePath, function (curr, prev) {
    fs.readFile(config.filePath, function (err, data) {
        if (err) throw err;
        console.log(data);
    });
});
The above code reads the whole text file whenever the file is modified.
Any suggestion would be greatly appreciated.
Working Code
fs.watchFile(config.filePath, function (curr, prev) {
    var filestream = fs.createReadStream(config.filePath, {
        start: prev.size,
        end: curr.size,
        encoding: "utf-8"
    });
    filestream.on('data', function (data) {
        console.log(data);
    });
});
You can work with the stat object of the file. The curr and prev arguments are both fs.Stats objects and have a "size" attribute.
I assume you are always adding data at the beginning or the end of the file, otherwise there is no way of knowing where the data was added.
The difference between prev.size and curr.size tells you how many bytes were added. Note that fs.readFile always reads the whole file; to read only a range, give fs.createReadStream an options object with start and end attributes, as in the working code above.
For example, if you always append to the end, you can make a call like this:
fs.createReadStream(config.filePath, { start: prev.size });
If you add at the beginning:
fs.createReadStream(config.filePath, { start: 0, end: curr.size - prev.size });
Hope that helps!
