How to use Node.js streams to append to the end of a file?

I have a node.js app, which opens a stream:
outputStream = fs.createWriteStream("output.txt");
I then, asynchronously, add text to the file:
outputStream.write(outputTxt, "utf8");
This code runs inside a loop, so it happens hundreds of times. However, the loop is asynchronous, so it sometimes pauses, and in the meantime I can edit the output.txt file in an external editor and (for example) add a few characters at the beginning.
However, when I do that, the next time outputStream.write executes, it overwrites the last few characters previously written (the same number of characters that I added externally).
Is there some way to prevent this? Some way to tell the write stream to find the end of the file and then append the text?
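One approach is to open the stream in append mode. A minimal sketch of that idea, assuming the setup from the question (outputTxt is the question's variable): with the "a" flag the OS positions every write at the current end of the file, so external edits made between writes should not be clobbered.
var fs = require("fs");
// Open in append mode: each write() lands at the file's current end,
// even if the file grew or changed in between.
var outputStream = fs.createWriteStream("output.txt", { flags: "a" });
outputStream.write(outputTxt, "utf8"); // outputTxt: the text variable from the question's loop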

Related

Excel VBA out of memory when reading file to string on subsequent runs. Works the first time, but not the second

I'm working on a project where I'm harvesting data from large text files, some 200MB up to 1.5GB. At the moment I'm testing this with a 200MB file. If I use the File System Object and read the files line-by-line, the process takes over an hour. So, in an attempt to speed things up, I found that reading the file into a long String, splitting it on specific headers, then in turn splitting each index of that array by lines and iterating line-by-line decreased the total read time to about a minute. Huge performance boost!
The problem is that the process only works the first time it is run. Any subsequent runs cause a 'Run-time error 14: Out of string space' error. The fix is to close the workbook and then reopen it. Then it will work again, but just once.
To circumvent that issue I created an interim method that breaks down the original file into multiple files of roughly 50MB each. Then I iterate over each of the smaller files, in turn split those on headers, and so on... Now my interim process, the one that breaks down the original file, only works the first time. Here is the code:
entire_file = FreeFile ' .................................................. get the next FreeFile number
Open orig_path For Input As #entire_file ' ................................ open file for reading
contents = Input$(LOF(entire_file), entire_file) ' ........................ read entire file into a String
Close #entire_file ' ...................................................... close the input file, no longer needed
blocks = Split(contents, "PAGE 10000") ' .................................. split on the PAGE number
contents = "" ' ........................................................... free up memory
The error happens on the 'blocks = Split' line; however, hovering the mouse over the variable 'contents' shows (Out of memory), so I'm not sure the problem is really with that line.
I close every opened file. I clear all String variables. I set all Class objects to Nothing. Yet this memory issue still persists. I just cannot figure out a way to clear the memory so that the method can be run more than once. Did I miss any other pertinent data that might help diagnose this?
I found a few other posts where someone encountered the same issue, but there was no resolution to the problem, or at least the thread was closed before a solution was posted. Many thanks in advance.

Attempting to append all content into file, last iteration is the only one filling text document

I'm trying to create a file and append all the content being calculated into that file, but when I run the script only the very last iteration is written to the file and nothing else.
My code is on Pastebin; it's too long to paste here, and I feel like you would have to see exactly how the iteration is happening.
To summarize it: go through an array of model numbers; if the model number matches, call the function that calculates that MAC_ADDRESS; when done calculating, store all the content in the file.
I have tried two possible routes and both have failed, giving the same result. There is no error in the code (it runs), but it just doesn't store the content in the file properly: there should be 97 different APs, and it's storing only 1.
The difference between the first and second attempts:
Attempt 1) I open/create the file at the beginning of the script and close it at the very end.
Attempt 2) I open/create and close the file per iteration.
First Attempt:
https://pastebin.com/jCpLGMCK
#Beginning of code
File = open("All_Possibilities.txt", "a+")
#End of code
File.close()
Second Attempt:
https://pastebin.com/cVrXQaAT
#Per function
File = open("All_Possibilities.txt", "a+")
#Per function
File.close()
If I'm not supposed to reference other websites, please let me know and I'll just paste the code in this post.
Rather than close(), please use with:
with open('All_Possibilities.txt', 'a') as file_out:
    file_out.write('some text\n')
The documentation explains that you don't need + to append writes to a file.
You may want to add some debugging console print() statements, or use a debugger like pdb, to verify that the write() statement actually ran, and that the variable you were writing actually contained the text you thought it did.
You have several loops that could be a one-liner using readlines().
Please do this:
$ pip install flake8
$ flake8 *.py
That is, please run the flake8 lint utility against your source code, and follow the advice that it offers you. In particular, it would be much better to name your identifier file than to name it File. The initial capital letter means something to humans reading your code -- it is used when naming classes, rather than local variables. Good luck!

tail -f implementation in node.js

I have created an implementation of tail -f in node.js using socket.io and the fs.watch function.
I read the file using fs.readFile, convert it into an array of lines, and return it to the client, storing the current length in a variable.
Then, whenever the "file changed" event fires, I re-read the whole file, convert it into an array of lines, compare the old length with the current length, and slice it like
fileContent.slice(oldLength, fileContent.length)
This gives me the changed content, so it's running perfectly fine.
Problem: I am reading the whole file every time it changes, which is not efficient if the file is large. So, is there any way to read the file once and then get only the changed content whenever there is a change?
I have also tried spawning a child process for tail -f:
var spawn = require('child_process').spawn;
var child = spawn('tail', ['-f', logfile]);
child.stdout.on('data', function (data) {
  var linesArray = data.toString().split("\n");
  console.log("Data sent " + linesArray[0]);
  io.emit('changed', {
    data: linesArray,
  });
});
The problems with this are:
The "data" event fires multiple times when I save the logfile after writing some content.
On first load it correctly returns the last ten lines of the file, but after a change it returns the whole content again and again.
So if you have any idea how to solve this problem, let me know. Till then I will dig through the internet.
So, I got the solution by reading someone else's code. The solution was to use fs.open, which opens the file, and then, instead of reading the whole file, to read a particular block from it using the fs.read() function.
To learn about fs.open/fs.read, read the Node.js file-system documentation.
Official docs: fs.read
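Below is a minimal sketch of that approach, assuming the logfile is only ever appended to; logfile and io are taken from the question's code above, not defined here. It remembers the byte offset already seen and, on each change, reads only the newly appended bytes with fs.read():
var fs = require('fs');
var offset = fs.statSync(logfile).size; // bytes already seen: start at the current end
fs.watch(logfile, function () {
  fs.stat(logfile, function (err, stats) {
    if (err || stats.size <= offset) return; // nothing new (or the file was truncated)
    fs.open(logfile, 'r', function (err, fd) {
      if (err) return;
      var length = stats.size - offset;
      var buffer = Buffer.alloc(length);
      // Read only the block between the old offset and the new end of file.
      fs.read(fd, buffer, 0, length, offset, function (err, bytesRead) {
        fs.close(fd, function () {});
        if (err) return;
        offset += bytesRead;
        var linesArray = buffer.toString('utf8', 0, bytesRead).split('\n');
        io.emit('changed', { data: linesArray }); // io: the socket.io instance from the question
      });
    });
  });
});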

Nodejs line-by-line close calling next line?

I'm using the plugin line-by-line to read a very large file. There's a case where I want just the first line, so in that case I'd close the reader immediately.
However, I was noticing that it would try to process the next line regardless. I dumbed it down as far as I could, and wrote this:
lr.on("line", function (line) {
lr.pause();
console.log("\rLine");
lr.close();
}
My console shows:
Line
Line
Without the lr.close(), it only logs Line once.
What am I missing?
Take a look at the source: close() emits the last lineFragment for you. If you don't need it, call lr.end(); instead.
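A minimal sketch of that fix, assuming a reader set up as in the question (the lr.end() call comes from the answer above, and the file name is hypothetical):
var LineByLineReader = require('line-by-line');
var lr = new LineByLineReader('very-large-file.txt');
lr.on('line', function (line) {
  console.log('\rLine'); // we only want this first line
  lr.end(); // per the answer: stops here without emitting the trailing lineFragment
});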

Why Does Last Line of VB6 Text File Being Read/Written to Another File Print Only Partially?

I am creating several text files programmatically using VB6, and then concatenating them all together into a single file.
I write text to the files using
Print #lngFileHandle, Text
so there should be a CR/LF even after the very last line of text in each file.
Then I append all these "subfiles" together into another text file that was opened this way:
Open strFileName For Append As #lngFileHandle
Strangely, my final resulting file looks good EXCEPT that the very last line of the last file being appended is only partially there.
The last few lines look like this in the file I'm reading FROM:
`<Name>` Referral for Service Home Delivered Meals`</Name>`
`<Name>` Referral for Service Adult Day Care/Health`</Name>`
`<Name>` Referral for Service Congregate Meals`</Name>`
but after being read in from that file and output to the final file, they look like this:
`<Name>` Referral for Service Home Delivered Meals`</Name>`
`<Name>` Referral for Service Adult Day Care/Health`</Name>`
`<Name>` Referral for Service Congr
The code I'm using to read in this particular "subfile" and output it to the final file is:
With mobjNewEntriesLog
    Do While Not .IsEOF
        strOutput = .ReadLine
        mobjMainLog.PrintLine strOutput
    Loop
End With
The .IsEOF function is as follows:
Public Function IsEOF() As Boolean
    If blnOpened Then
        IsEOF = EOF(lngFileHandle)
    Else
        IsEOF = True
    End If
End Function
It would make more sense to me if I weren't getting the last line at ALL, but getting just PART of it? I don't get that.
Anybody see anything that would make the last line only print partially to the final file?
TIA.
Ensure you are closing your file, as this may be required to flush out any data that is pending to be written.
VB6 file numbers are not file handles, so don't call them that. They are indexes into a file descriptor table in the runtime where the actual handle, mode, buffer length, buffer, pointers, etc. are stored.
The Close statement is not synchronous, but a "lazy close" that may not have flushed all data and updated the EOF pointer of the file by the time you turn around and try to read it again. This behavior is intentional as far as I can determine, perhaps for performance reasons.
A Reset statement can be used to force all open files closed, and it is synchronous. This isn't always practical; however, it may be fine in your case. Easy enough to try: add a Reset before you re-open any of your files to concatenate them.
