Is this terminal log the consequence of Node.js's asynchronous nature? - node.js

I haven't found anything specific about this. It isn't really a problem, but I would like to understand better what is going on here.
Basically, I'm testing some simple Node.js code, like this:
//Summary: Open a file, write to the file, delete the file.
let fs = require('fs');

fs.open('mynewfile.txt', 'w', function (err, file) {
    if (err) throw err;
    console.log('Created file!')
})

fs.appendFile('mynewfile.txt', 'After creating this file with fs.open, I appended data to it with fs.appendFile', function (err) {
    if (err) throw err;
    console.log('Added text to the file.')
})

fs.unlink('mynewfile.txt', function (err) {
    if (err) throw err;
    console.log('File deleted!')
})
console.log(__dirname);
I thought this code would be executed in the order it was written, from top to bottom, but when I look at the terminal I'm not sure that was the case, because this is what I get:
$ node FileSystem.js
C:\Users\Simon\OneDrive\Desktop\Portfolio\Learning Projects\NodeJS_Tutorial
Created file!
File deleted!
Added text to the file.
//Expected order would be: create file, add text to file, delete file, log dirname.
Despite what the terminal might make you think, in the end the code order still seems to have been followed somehow, because when I look at my folder the file was deleted and I have nothing left in the directory.
So I was wondering: why doesn't the terminal log in the same order that the code is written, from top to bottom?
Would this be the result of NodeJS asynchronous nature or is it something else ?

The code is (in principle) executed from top to bottom, as you say. But fs.open, fs.appendFile, and fs.unlink are asynchronous. That is, they are started in that particular order, but there is no guarantee whatsoever in which order they finish, and thus you can't guarantee in which order the callbacks are executed. If you run the code multiple times, there is a good chance you will encounter different execution orders ...
If you need a specific order, you have two options:
You call the later operation only in the callback of the prior, i.e. something like below:
fs.open('mynewfile.txt', 'w', function (err, file) {
    if (err) throw err;
    console.log('Created file!')
    fs.appendFile('mynewfile.txt', '...', function (err) {
        if (err) throw err;
        console.log('Added text to the file.')
        fs.unlink('mynewfile.txt', function (err) {
            if (err) throw err;
            console.log('File deleted!')
        })
    })
})
You see that the code gets quite ugly and hard to read with all that increasing nesting ...
You switch to the promise-based approach:
let fs = require('fs').promises;

fs.open("myfile.txt", "w")
    .then(file => {
        return fs.appendFile("myfile.txt", "...");
    })
    .then(res => {
        return fs.unlink("myfile.txt");
    })
    .catch(e => {
        console.log(e);
    })
With the promise-version of the operations, you can also use async/await
async function doit() {
    let file = await fs.open('myfile.txt', 'w');
    await fs.appendFile('myfile.txt', '...');
    await fs.unlink('myfile.txt');
}
For all three possibilities, you probably need to close the file before you can unlink it.
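For the async/await variant, a minimal sketch of that (assuming the fs.promises API, where open resolves to a FileHandle that should be closed) could look like this:

const fs = require('fs').promises;

async function doit() {
    const file = await fs.open('mynewfile.txt', 'w');
    try {
        await fs.appendFile('mynewfile.txt', '...');
    } finally {
        // release the handle before removing the directory entry
        await file.close();
    }
    await fs.unlink('mynewfile.txt');
}

doit().catch(console.error);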
For more details, please read about Promises, async/await and the event loop in JavaScript.

It's a combination of 2 things:
The asynchronous nature of Node.js, as you correctly assume
Being able to unlink an open file
What likely happened is this:
1) The file was opened and created at the same time (open with flag w)
2) The file was opened a second time for appending (fs.appendFile)
3) The file was unlinked
4) Data was appended to the file (while it was already unlinked) and the file was closed
When data was being appended, the file still existed on disk as an inode, but had zero hard links (references) to it. It still takes up space then, but the OS checks the reference count when closing and frees up the space if the count has fallen to zero.
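To make that concrete, here is a rough sketch of that sequence under POSIX semantics (the file name here is just an illustration):

const fs = require('fs');

// Open (and create) a file, unlink it, then keep writing through the
// still-open descriptor. The directory entry is gone, but the inode
// stays alive until the last reference (the fd) is closed.
fs.open('ghost.txt', 'w', (err, fd) => {
    if (err) throw err;
    fs.unlink('ghost.txt', (err) => {
        if (err) throw err;
        fs.write(fd, 'still writable after unlink', (err) => {
            if (err) throw err;
            fs.close(fd, (err) => {
                if (err) throw err;
                console.log('closed; the OS can now reclaim the space');
            });
        });
    });
});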
People sometimes run into a similar situation with daemons such as HTTP servers that employ log rotation: if something goes wrong when switching over logs, the old log file may be unlinked but not closed, so it's never cleaned up and it takes up space forever (until you reboot or restart the process).
Note that the ordering of operations you're observing is not deterministic, and it is possible that they would be reordered on another run. Don't rely on it.

You could write this as (untested):
let fs = require('fs').promises;

const main = async () => {
    await fs.open('mynewfile.txt', 'w');
    await fs.appendFile('mynewfile.txt', 'content');
    await fs.unlink('mynewfile.txt');
};

main()
    .then(() => console.log('success'))
    .catch(console.error);
or within another async function:
const someOtherFn = async () => {
    try {
        await main();
    } catch (e) {
        // handle any rejection to your liking
    }
}
(The catch block is not mandatory. You can opt to just let them throw to the top. It's just to showcase how async / await allows you to make asynchronous code appear as if it were synchronous, without running into callback hell.)

Related

How do I make the node compiler work with these two operations of fs?

fs.rename("${nombreHtml}.html",(err)=>{
console.log(err)
})
fs.appendFileSync("${nombreHtml}.html", htmlSeparado, () => { })
I try to run these two operations but it doesn't work.
fs.rename is an asynchronous task.
By the time fs.rename finishes its execution, fs.appendFileSync has already tried appending data to an HTML file which did not yet exist at that point.
fs.rename ... awaiting callback
fs.append ... failing
fs.rename finished, file now has a new name.
You probably want to either place fs.appendFileSync inside the fs.rename callback, or switch to promises. (example at the bottom)
Example that should work:
fs.rename("${nombreHtml}.html", (err) => {
    if (err) console.log(err)
    else {
        fs.appendFileSync("${nombreHtml}.html", htmlSeparado)
    }
})
By the way, synchronous functions block the event loop and hence freeze your server while that function is being handled, making it unavailable for any other request. For that reason, the filesystem's synchronous functions are generally not recommended, as read/write/append operations can take quite long. It is recommended to use the async versions, which take a callback or return a promise, as you have done with fs.rename.
fs has a built-in sub-module with promise-based versions of the same functions, which can be accessed via require('fs').promises.
This way you could just:
const { rename, appendFile } = require('fs').promises;

try {
    await rename("${nombreHtml}.html");
    await appendFile("${nombreHtml}.html", htmlSeparado);
} catch (error) {
    console.log(error);
}
I assume you want a template string so that the variables insert themselves into the string:
fs.rename(`${nombreHtml}.html`, (err) => {
    console.log(err)
})
fs.appendFileSync(`${nombreHtml}.html`, htmlSeparado, () => { })

Complex sequencing of promises - nested

After a lot of googling I have not been able to confirm the correct approach to this problem. The following code runs as expected but I have a grave feeling that I am not approaching this in the correct way, and I am setting myself up for problems.
The following code is initiated by the main app.js file and is passed a location to start loading XML files from and processing them into a MongoDB database:
exports.processProfiles = function(path) {
    var deferrer = q.defer();

    q(dataService.deleteProfiles()) // simple mongodb call to empty the Profiles collection
        .then(function(deleteResult) {
            return loadFilenames(path); // method to load all filenames in the given path using fs
        })
        .then(function(filenames) {
            // now we have all the file names lets load and save
            filenames.forEach(function(filename) {
                // Here is where i think the problem is!
                // kick off another promise chain for the dynamically sized array of files to process
                q(loadFileContent(path, filename)) // first we load the data in the file
                    .then(function(inboundFile) {
                        // then parse XML structure to my new shiny JSON structure
                        // and ask Mongo to store it for me
                        return dataService.createProfile(processProfileXML(filename, inboundFile));
                    })
                    .done(function(result) {
                        console.log(result);
                    })
            });
        })
        .catch(function(err) {
            deferrer.reject('Unable to Process Profile records : ' + err);
        })
        .done(function() {
            deferrer.resolve('Profile Processing Completed');
        });

    return deferrer.promise;
}
Whilst this code works, these are my main concerns, which I cannot solve on my own after a few hours of Google and reading.
1) Is this blocking? From the console output it is difficult to tell whether this is running asynchronously as I want it to. I think it is, but advice on whether I am doing something fundamentally wrong would be great.
2) Is having a nested promise a bad idea? Should I be linking it to the outer promise? I have tried but could not get anything to compile or run.
I haven't used Q in a really long time, but I think what you'd need to do is let it know you're about to hand back an array of promises that all need to be satisfied before moving on.
Additionally, as you're waiting for multiple promises in one section of code, rather than nesting further, throw the 'set' of promises back up so the next block runs once they're all satisfied.
q(dataService.deleteProfiles()) // simple mongodb call to empty the Profiles collection
    .then(function (deleteResult) {
        return loadFilenames(path); // method to load all filenames in the given path using fs
    })
    .then(function (filenames) {
        return q.all(
            filenames.map(function (filename) {
                return q(loadFileContent(path, filename))
                    .then(function (inboundFile) { /* Do stuff with each file's content */ });
            })
        );
    })
    .then(function (resultsOfLoadFileContentsPromises) {
        console.log('I did stuff with all the things');
    })
    .catch(function (err) {});
What you have is not 'blocking'. What you're really doing with promises is moving things into new 'block'-like sections. The more blocks you have, the more async-ish your code will appear. If nothing else is running apart from this promise chain, it will still appear procedural.
But inner promises must still resolve before the parent promise that contains them can resolve.
Inner promises like the ones you have aren't inherently bad. Personally, I break them out into separate functions or files to make them easier to reason about, but I wouldn't call them 'bad' unless there's no need for the inner promise to exist. Where possible (as in your example here), I've adjusted the code so the next set of promises is thrown back up for a new section to deal with the data once it's been fetched.
(I'm not great with Q though, this code will probably require a little further tweaking).
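As an illustration of breaking the inner chain out, here is a rough sketch (processOneFile is a hypothetical helper name; the other functions are the ones from your code):

// Hypothetical helper: loads one file and stores the parsed profile.
function processOneFile(path, filename) {
    return q(loadFileContent(path, filename))
        .then(function (inboundFile) {
            return dataService.createProfile(processProfileXML(filename, inboundFile));
        });
}

// The outer chain then just maps filenames onto the helper and waits for all of them.
q(dataService.deleteProfiles())
    .then(function () { return loadFilenames(path); })
    .then(function (filenames) {
        return q.all(filenames.map(function (filename) {
            return processOneFile(path, filename);
        }));
    })
    .then(function (results) { console.log(results); })
    .catch(function (err) { console.log(err); });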

node.js file system problems

I keep banging my head against the wall because of tons of different errors. This is the code I am trying to use:
fs.readFile("balance.txt", function (err, data) //At the beginning of the script (checked, it works)
{
if (err) throw err;
balance=JSON.parse(data);;
});
fs.readFile("pick.txt", function (err, data)
{
if (err) throw err;
pick=JSON.parse(data);;
});
/*....
.... balance and pick are modified
....*/
if (shutdown)
{
    fs.writeFile("balance2.txt", JSON.stringify(balance));
    fs.writeFile("pick2.txt", JSON.stringify(pick));
    process.exit(0);
}
At the end of the script, the files have not been modified in the slightest. I then found out on this site that the files were being opened two times simultaneously, or something like that, so I tried this:
var balance, pick;

var stream = fs.createReadStream("balance.txt");
stream.on("readable", function ()
{
    balance = JSON.parse(stream.read());
});

var stream2 = fs.createReadStream("pick.txt");
stream2.on("readable", function ()
{
    pick = JSON.parse(stream2.read());
});

/****
****/

fs.unlink("pick.txt");
fs.unlink("balance.txt");

var stream = fs.createWriteStream("balance.txt", {flags: 'w'});
var stream2 = fs.createWriteStream("pick.txt", {flags: 'w'});
stream.write(JSON.stringify(balance));
stream2.write(JSON.stringify(pick));
process.exit(0);
But this time, both files are empty... I know I should catch errors, but I just don't see where the problem is. I don't mind storing the two objects in the same file, if that helps. Besides that, I had never written any JavaScript in my life before yesterday, so please give me a simple explanation if you know what failed here.
What I think you want to do is use readFileSync rather than readFile to read your files, since you need them to be read before doing anything else in your program (http://nodejs.org/api/fs.html#fs_fs_readfilesync_filename_options).
This will make sure you have read both the files before you execute any of the rest of your code.
Make your code look like this:
try
{
    balance = JSON.parse(fs.readFileSync("balance.txt"));
    pick = JSON.parse(fs.readFileSync("pick.txt"));
}
catch (err)
{
    throw err;
}
I think you will get the functionality you are looking for by doing this.
Note that you will not be able to check for an error in the same way you can with readFile. Instead, you will need to wrap each call in a try/catch or use existsSync before each operation to make sure you aren't trying to read a file that doesn't exist.
How to capture no file for fs.readFileSync()?
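For illustration, a minimal sketch of the existsSync variant (the try/catch version above is usually the safer choice):

// Guard each synchronous read with an existence check.
// Note: the file could still disappear between the check and the read,
// which is why the try/catch approach is generally safer.
if (fs.existsSync("balance.txt")) {
    balance = JSON.parse(fs.readFileSync("balance.txt"));
}
if (fs.existsSync("pick.txt")) {
    pick = JSON.parse(fs.readFileSync("pick.txt"));
}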
Furthermore, you have the same problem on the writes. You are kicking off async writes and then immediately calling process.exit(0). A better way to do this would be to either write them sequentially asynchronously and then exit or to write them sequentially synchronously then exit.
Async option:
if (shutdown)
{
    fs.writeFile("balance2.txt", JSON.stringify(balance), function (err) {
        fs.writeFile("pick2.txt", JSON.stringify(pick), function (err) {
            process.exit(0);
        });
    });
}
Sync option:
if (shutdown)
{
    fs.writeFileSync("balance2.txt", JSON.stringify(balance));
    fs.writeFileSync("pick2.txt", JSON.stringify(pick));
    process.exit(0);
}

Creating a file only if it doesn't exist in Node.js

We have a buffer we'd like to write to a file. If the file already exists, we need to increment an index on it, and try again. Is there a way to create a file only if it doesn't exist, or should I just stat files until I get an error to find one that doesn't exist already?
For example, I have files a_1.jpg and a_2.jpg. I'd like my method to try creating a_1.jpg and a_2.jpg, and fail, and finally successfully create a_3.jpg.
The ideal method would look something like this:
fs.writeFile(path, data, { overwrite: false }, function (err) {
if (err) throw err;
console.log('It\'s saved!');
});
or like this:
fs.createWriteStream(path, { overwrite: false });
Does anything like this exist in node's fs library?
EDIT: My question isn't if there's a separate function that checks for existence. It's this: is there a way to create a file if it doesn't exist, in a single file system call?
As your intuition correctly guessed, the naive solution with a pair of exists / writeFile calls is wrong. Asynchronous code runs in unpredictable ways, and in the given case this can happen:
1) Is there a file a.txt? — No.
2) (File a.txt gets created by another program)
3) Write to a.txt if it's possible. — Okay.
But yes, we can do that in a single call. We're working with the file system, so it's a good idea to read the developer manual on fs. And hey, here's an interesting part.
'w' - Open file for writing. The file is created (if it does not
exist) or truncated (if it exists).
'wx' - Like 'w' but fails if path exists.
So all we have to do is just add wx to the fs.open call. But hey, we don't like fopen-like IO. Let's read up on fs.writeFile a bit more.
fs.writeFile(filename, data[, options], callback)
filename String
data String | Buffer
options Object
encoding String | Null default = 'utf8'
flag String default = 'w'
callback Function
That options.flag looks promising. So we try
fs.writeFile(path, data, { flag: 'wx' }, function (err) {
if (err) throw err;
console.log("It's saved!");
});
And it works perfectly for a single write. But I guess this code will still fail in some more bizarre ways if you try to solve your whole task with it. You have an atomic "check for a_#.jpg existence, and write there if it's empty" operation, but all the rest of the fs state is not locked, and the a_1.jpg file may spontaneously disappear while you're already checking a_5.jpg. Most file systems are not ACID databases, and the fact that you're able to do at least some atomic operations is miraculous. It's very likely that the wx code won't work on some platform. So for the sake of your sanity, use a database, finally.
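That said, if you still want the retry approach from the question, a rough sketch using the wx flag might look like this (the helper name and loop are just an illustration, not a production-ready solution):

const fs = require('fs');

// Try a_1.jpg, a_2.jpg, ... until a name is free; 'wx' makes the
// existence check and the creation a single atomic call.
function writeNextFree(data, i, callback) {
    i = i || 1;
    const name = 'a_' + i + '.jpg';
    fs.writeFile(name, data, { flag: 'wx' }, function (err) {
        if (err && err.code === 'EEXIST') return writeNextFree(data, i + 1, callback);
        if (err) return callback(err);
        callback(null, name);
    });
}

writeNextFree(Buffer.from('...'), 1, function (err, name) {
    if (err) throw err;
    console.log('saved as ' + name);
});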
Some more info for the suffering
Imagine we're writing something like memoize-fs that caches results of function calls to the file system to save us some network/CPU time. Could we open the file for reading if it exists, and for writing if it doesn't, all in a single call? Let's take a closer look at those flags. After a bit of mental exercise we can see that a+ does what we want: if the file doesn't exist, it creates one and opens it both for reading and writing, and if the file exists it does so without clearing the file (as w+ would). But we can use it neither in the (smth)File nor in the create(Smth)Stream functions. And that seems like a missing feature.
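A small sketch of that idea with the fs.promises FileHandle API (the function name is hypothetical, and it assumes no prior reads have moved the file position):

const fsp = require('fs').promises;

// Open with 'a+': the file is created if missing and never truncated,
// so a single open() covers both the "cache hit" and "cache miss" cases.
async function readOrCreate(path) {
    const handle = await fsp.open(path, 'a+');
    try {
        return await handle.readFile('utf8'); // '' if the file was just created
    } finally {
        await handle.close();
    }
}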
So feel free to file it as a feature request (or even a bug) on the Node.js GitHub, as the lack of an atomic asynchronous file system API is a drawback of Node. Though don't expect changes any time soon.
Edit: I would like to link to articles by Linus and by Dan Luu on why exactly you don't want to do anything clever with your fs calls, because otherwise that claim is left mostly unsupported.
What about using the a option?
According to the docs:
'a+' - Open file for reading and appending. The file is created if it does not exist.
It seems to work perfectly with createWriteStream
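For example (a quick sketch; the path is just illustrative):

const fs = require('fs');

// 'a+' creates the file if it doesn't exist and appends if it does,
// without truncating existing content.
const stream = fs.createWriteStream('log.txt', { flags: 'a+' });
stream.write('a new line\n');
stream.end();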
This method is no longer recommended. fs.exists is deprecated. See comments.
Here are some options:
1) Have 2 "fs" calls. The first one is the "fs.exists" call, and the second is "fs.write / read, etc"
// Checks if the file exists.
// If it does, it just calls back.
// If it doesn't, then the file is created.
function checkForFile(fileName, callback)
{
    fs.exists(fileName, function (exists) {
        if (exists)
        {
            callback();
        }
        else
        {
            fs.writeFile(fileName, '', { flag: 'wx' }, function (err)
            {
                callback();
            })
        }
    });
}
function writeToFile()
{
    checkForFile("file.dat", function ()
    {
        // It is now safe to write/read to file.dat
        fs.readFile("file.dat", function (err, data)
        {
            // do stuff
        });
    });
}
2) Or create an empty file first:
--- Sync:
// If you want to force the file to be empty then you want to use the 'w' flag:
var fd = fs.openSync(filepath, 'w');
// That will truncate the file if it exists and create it if it doesn't.
// Wrap it in an fs.closeSync call if you don't need the file descriptor it returns:
fs.closeSync(fs.openSync(filepath, 'w'));
--- Async:
var fs = require("fs");

fs.open(path, "wx", function (err, fd) {
    // handle error
    fs.close(fd, function (err) {
        // handle error
    });
});
3) Or use "touch": https://github.com/isaacs/node-touch
To do this in a single system call you can use the fs-extra npm module.
After this, the file will have been created, as well as the directory it is to be placed in.
const fs = require('fs-extra');
const file = '/tmp/this/path/does/not/exist/file.txt';

fs.ensureFile(file, err => {
    console.log(err); // => null
});
Another way is to use ensureFileSync, which will do the same thing but synchronously.
const fs = require('fs-extra');
const file = '/tmp/this/path/does/not/exist/file.txt'
fs.ensureFileSync(file)
With async / await and TypeScript I would do:
import * as fs from 'fs'

async function upsertFile(name: string) {
    try {
        // try to read file
        await fs.promises.readFile(name)
    } catch (error) {
        // create empty file, because it wasn't found
        await fs.promises.writeFile(name, '')
    }
}
Here's a synchronous way of doing it:
try {
    fs.truncateSync(filepath, 0);
} catch (err) {
    fs.writeFileSync(filepath, "", { flag: "wx" });
}
If the file exists it will get truncated; otherwise, the truncate call throws an error and the file gets created.
This works for me.
// Use the file system fs promises
const { access } = require('fs/promises');

// Returns true if the file exists.
// Don't use fs.exists, which is deprecated!
const fexists = async (path) => {
    try {
        await access(path);
        return true;
    } catch {
        return false;
    }
}
// Wrapper for your main program
async function mainapp() {
    if (await fexists("./users.json")) {
        console.log("File is here");
    } else {
        console.log("File not here - so make one");
    }
}

// run your program
mainapp();
Just keep an eye on your async / awaits so everything plays nicely.
Hope this helps.
You can do something like this:
function writeFile(i) {
    i = i || 0;
    var fileName = 'a_' + i + '.jpg';
    fs.exists(fileName, function (exists) {
        if (exists) {
            writeFile(++i);
        } else {
            // data is the buffer you want to write (from the question)
            fs.writeFile(fileName, data, function (err) {
                if (err) throw err;
            });
        }
    });
}

nodejs express fs iterating files into array or object failing

So I'm trying to use the Node.js fs module in my Express app to iterate a directory, store each filename in an array which I can pass to my Express view and iterate through the list, but I'm struggling to do so. When I do a console.log within the files.forEach function loop, it's printing the filename just fine, but as soon as I try to do anything such as:
var myfiles = [];
var fs = require('fs');

fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        myfiles.push(file);
    });
});

console.log(myfiles);
it fails and just logs an empty array. So I'm not sure exactly what is going on. I think it has to do with callback functions, but if someone could walk me through what I'm doing wrong, why it's not working (and how to make it work), it would be much appreciated.
The myfiles array is empty because the callback hasn't been called yet by the time you call console.log().
You'll need to do something like:
var fs = require('fs');
fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        // do something with each file HERE!
    });
});
// trying to do something with files here won't work,
// because the callback hasn't fired yet.
Remember, everything in Node happens "at the same time", in the sense that, unless you're doing your processing inside your callbacks, you cannot guarantee that asynchronous functions have completed yet.
One way around this problem for you would be to use an EventEmitter:
var fs = require('fs'),
    EventEmitter = require('events').EventEmitter,
    filesEE = new EventEmitter(),
    myfiles = [];

// this event will be called when all files have been added to myfiles
filesEE.on('files_ready', function () {
    console.dir(myfiles);
});

// read all files from current directory
fs.readdir('.', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        myfiles.push(file);
    });
    filesEE.emit('files_ready'); // trigger files_ready event
});
As several have mentioned, you are using an async method, so you have a nondeterministic execution path.
However, there is an easy way around this. Simply use the Sync version of the method:
var myfiles = [];
var fs = require('fs');

var arrayOfFiles = fs.readdirSync('./myfiles/');

// Yes, the following is not super-smart, but you might want to process the files. This is how:
arrayOfFiles.forEach(function (file) {
    myfiles.push(file);
});

console.log(myfiles);
That should work as you want. However, using sync calls is generally not good, so you should not do it unless it is vitally important for it to be synchronous.
Read more here: fs.readdirSync
fs.readdir is asynchronous (as with many operations in node.js). This means that the console.log line is going to run before readdir has a chance to call the function passed to it.
You need to either:
Put the console.log line within the callback function given to readdir, i.e:
fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        myfiles.push(file);
    });
    console.log(myfiles);
});
Or simply perform some action with each file inside the forEach.
I think it has to do with callback functions,
Exactly.
fs.readdir makes an asynchronous request to the file system for that information, and calls the callback at some later time with the results.
So function (err, files) { ... } doesn't run immediately, but console.log(myfiles) does.
At some later point in time, myfiles will contain the desired information.
You should note BTW that files is already an Array, so there is really no point in manually appending each element to some other blank array. If the idea is to put together the results from several calls, then use .concat; if you just want to get the data once, then you can just assign myfiles = files directly.
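For example (a sketch of both options, inside the callback):

fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    myfiles = files;                    // just take the array as-is, or...
    // myfiles = myfiles.concat(files); // ...merge with results from earlier calls
    console.log(myfiles);
});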
Overall, you really ought to read up on "Continuation-passing style".
I faced the same problem, and based on the answers given in this post I've solved it with Promises, which seem to be a perfect fit in this situation:
router.get('/', (req, res) => {
    var viewBag = {}; // It's just my little habit from .NET MVC ;)

    var readFiles = new Promise((resolve, reject) => {
        fs.readdir('./myfiles/', (err, files) => {
            if (err) {
                reject(err);
            } else {
                resolve(files);
            }
        });
    });

    // showcase, just in case you need to implement more async operations before the route responds
    var anotherPromise = new Promise((resolve, reject) => {
        doAsyncStuff((err, anotherResult) => {
            if (err) {
                reject(err);
            } else {
                resolve(anotherResult);
            }
        });
    });

    Promise.all([readFiles, anotherPromise]).then((values) => {
        viewBag.files = values[0];
        viewBag.otherStuff = values[1];
        console.log(viewBag.files); // logs e.g. [ 'file.txt' ]
        res.render('your_view', viewBag);
    }).catch((errors) => {
        res.render('your_view', { errors: errors }); // you can use the 'errors' property to render errors in the view or implement a different error handling scheme
    });
});
Note: you don't have to push the found files into a new array because you already get an array from fs.readdir()'s callback. According to the Node docs:
The callback gets two arguments (err, files) where files is an array
of the names of the files in the directory excluding '.' and '..'.
I believe this is a very elegant and handy solution, and most of all, it doesn't require you to bring any new modules into your script.
