I use a simple script to write to a file from a Node.js process:
var fs = require('fs');

var stream = fs.createWriteStream('test');
stream.once('open', function(fd) {
    stream.write(progress_percents);
});
In the log I see the error "too many open files". How can I write to the file only if it is not being used by other processes?
You've diagnosed the problem incorrectly. The "too many open files" error means that too many file descriptors have been opened by this process; most likely a new write stream is created for every update and never closed.
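A minimal sketch of releasing the descriptor after each write (reusing the progress_percents variable from the question; the writeProgress helper is just illustrative):

var fs = require('fs');

// Hypothetical helper: write the progress value and close the descriptor when done.
function writeProgress(progress_percents) {
    var stream = fs.createWriteStream('test');
    stream.once('open', function(fd) {
        // end() writes the remaining data and then closes the underlying file descriptor.
        stream.end(progress_percents);
    });
}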
Related
I am using the Node.js filesystem module to write data to a .csv file. If I open the file while the write operation is still incomplete, it causes data loss because fs is unable to write while the file is open. Is there any way to open the file in read-only mode while the write operation is going on?
The issue was resolved when I used fs.createWriteStream for the file operation.
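A minimal sketch of that approach, assuming the data.csv path and the appendRows helper are placeholders:

var fs = require('fs');

// Open one append-mode stream and reuse it for every row.
var out = fs.createWriteStream('data.csv', { flags: 'a' });

function appendRows(rows) {
    rows.forEach(function(row) {
        out.write(row + '\n');
    });
}

// Call end() once the application has finished writing.
// out.end();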
I have an application that streams data to a file, can I use Node.js to read the file while it's being streamed to?
I tried using createReadStream, but it only read one chunk and then the stream ended.
You could try watching for file changes with fs.watchFile(filename[, options], listener) or node-watch, and on each change read the last line with read-last-lines.
Although I'm not sure how efficient it would be; a rough sketch follows.
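A sketch using only core APIs (fs.watchFile plus a read stream offset at the previous size) instead of the read-last-lines package; the filename and the handler body are placeholders:

var fs = require('fs');

var filename = 'output.log';          // placeholder path
var lastSize = 0;

fs.watchFile(filename, { interval: 500 }, function(curr, prev) {
    if (curr.size <= lastSize) {
        lastSize = curr.size;         // file was truncated or unchanged
        return;
    }
    // Read only the bytes appended since the last check.
    fs.createReadStream(filename, { start: lastSize, end: curr.size - 1 })
        .on('data', function(chunk) {
            process.stdout.write(chunk); // handle the newly appended data here
        });
    lastSize = curr.size;
});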
I'm trying to read the contents of some .gz log files using streams in node.
I've started simply with: fs.createReadStream('log.gz').pipe(zlib.createUnzip()).
This works and I can pipe to process.stdout to verify. I'd like to pipe this to a new writable stream that I can attach a data event to, so I can actually work with the contents. I guess I just don't fully understand how streams work. I tried just creating a new writable stream, var writer = fs.createWriteStream(), but this doesn't work because it requires a path.
Any ideas how I can go about doing this (without creating any other files to write to)?
var fs = require('fs');
var zlib = require('zlib');
var unzipStream = zlib.createUnzip();
unzipStream.on('data', myDataHandler).on('end', myEndHandler);
fs.createReadStream('log.gz').pipe(unzipStream);
That will get you the data and end events. Since it's log data, you may also find the split module useful for getting events line by line; a sketch follows.
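As a sketch of line-by-line handling using Node's built-in readline module instead of split (the per-line handler is a placeholder):

var fs = require('fs');
var zlib = require('zlib');
var readline = require('readline');

// Unzip the log and emit one event per line.
var rl = readline.createInterface({
    input: fs.createReadStream('log.gz').pipe(zlib.createUnzip())
});

rl.on('line', function(line) {
    // placeholder handler: replace with your own per-line logic
    console.log(line);
});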
I have an application that will log data to a text file anytime the application is used. It may be used once a day or maybe once every second.
Is there anything wrong with me opening a handle to the text file and keeping it open as long as my application is running and never close it? This way I can just append data to the log without reopening the file handle. Are there consequences to this decision I should be aware of?
fs.open returns a file descriptor, and if you don't close it you may end up with a file descriptor leak. The descriptor won't be closed for you, even if an error occurs.
On the other hand, fs.readFile, fs.writeFile, fs.appendFile, fs.createReadStream and fs.createWriteStream don't hand you a file descriptor. They open the file, operate on it, and then close it themselves.
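If you do decide to keep one handle open for the lifetime of the application, a minimal sketch of that pattern (the log path and helper names are placeholders):

var fs = require('fs');

// One long-lived append stream; the underlying descriptor stays open
// until end() is called, so there is exactly one open handle.
var log = fs.createWriteStream('app.log', { flags: 'a' });

function logLine(message) {
    log.write(new Date().toISOString() + ' ' + message + '\n');
}

// Call this once during shutdown to flush and release the descriptor.
function closeLog() {
    log.end();
}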
My node.js app runs a function every second to (recursively) read a directory tree for .json files. These files are uploaded to the server via FTP from clients, and are placed in the folder that the node script is running on.
What I've found (at least what I think is happening) is that node is not waiting for the .json file to be fully written before trying to read it, and as such is throwing an 'Unexpected end of input' error. It seems as though the filesystem needs a few seconds (maybe milliseconds) to write the file completely. This could also have something to do with the file being written over FTP (overhead, possibly; I'm totally guessing here...).
Is there a way that we can wait for the file to be fully written to the filesystem before trying to read it with node?
fs.readFile(file, 'utf8', function(err, data) {
    var json = JSON.parse(data); // throws 'Unexpected end of input' on a partial file
});
You can check to see if the file is still growing with this:
https://github.com/felixge/node-growing-file
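As an alternative sketch using only core APIs rather than the library above: poll fs.stat and parse the file only once its size has stopped changing (the whenFileStable helper and the 500 ms interval are placeholders):

var fs = require('fs');

// Poll the file size; call back once two consecutive checks report the same size.
function whenFileStable(file, interval, callback) {
    var lastSize = -1;
    var timer = setInterval(function() {
        fs.stat(file, function(err, stats) {
            if (err) return;                 // file may not be fully visible yet
            if (stats.size === lastSize) {
                clearInterval(timer);
                callback();
            } else {
                lastSize = stats.size;
            }
        });
    }, interval);
}

whenFileStable(file, 500, function() {
    fs.readFile(file, 'utf8', function(err, data) {
        var json = JSON.parse(data);         // safe to parse once the size is stable
    });
});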