I'm trying to loop over a bunch of directories and check whether a file inside each of them exists, using Node.js and fs.stat(). I've got a simple for loop over the directories, and inside it an fs.stat() call that checks whether "project.xml" exists in that particular directory. My code looks like this:
for (var i = 0, length = prDirs.length; i < length; i++) {
    var path = Config["ProjectDirectory"] + "/" + prDirs[i];
    console.log("PATH=" + path);
    fs.stat(path + "/project.xml", function (err, stat) {
        if (err == null) {
            console.log(" => PATH=" + path);
        }
    })
}
Node.js loops over the directories correctly and the console.log() outputs them all, but the code inside the if inside the fs.stat() callback runs only once, at the end of the loop. My console shows this:
PATH=(...)/PHP
PATH=(...)/Electron
PATH=(...)/testapp
PATH=(...)/Vala
=> PATH=(...)/Vala/project.xml
But the project.xml I'm looking for is in testapp/, not in Vala/; Vala/ just happens to be the last entry in prDirs.
The code above is my latest attempt. I've tried plenty of other variations, and one of them (where I appended an else to the if inside fs.stat()) showed me that fs.stat() actually does get invoked; only the code inside the if never runs, while the code in the appended else did run.
Thanks in advance!
fs.stat is an asynchronous I/O function, so its callback is only called once the main thread is idle, which in your case means after the for loop has finished. By then the shared path variable (declared with var, so it is function-scoped rather than loop-scoped) holds its last value, which is why only the Vala path is ever printed. Instead of a for loop, I suggest iterating over the folders in an asynchronous manner. You can use async.each, async.eachSeries, or implement it yourself.
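For example, here is a minimal sketch with async.each (assuming the async module is installed, and reusing the prDirs and Config variables from the question). Because each iteration gets its own fullPath binding, the closure problem disappears as well:

var fs = require('fs');
var async = require('async'); // npm install async

async.each(prDirs, function (dir, done) {
    var fullPath = Config["ProjectDirectory"] + "/" + dir;
    fs.stat(fullPath + "/project.xml", function (err, stat) {
        if (!err) {
            console.log(" => PATH=" + fullPath);
        }
        done(); // don't pass err along: a missing file shouldn't abort the other checks
    });
}, function () {
    // called once every directory has been checked
});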
As @Gil Z mentioned, fs.stat is async. I would suggest using promises if you want to keep a for-each loop and make the code look synchronous.
Here is an example. It works on Node v6.9:
"use strict";
const fs = require('fs');
let paths = ["tasks", "aaa", "bbbb"];
//method to get stat data using promises
function checkFileExists(path) {
return new Promise((resolve, reject) => {
fs.stat(path + "/project.xml", (err, res) => {
resolve(err ? "Not found in " + path : " => PATH=" + path);
});
});
}
//create promise array with each directory
let promiseArr = [];
paths.forEach(pathPart => {
let path = process.cwd() + "/" + pathPart;
promiseArr.push(checkFileExists(path));
});
//run all promises and collect results
Promise.all(promiseArr)
.then(reslut => {
console.log(reslut);
})
.catch(e => console.log("Error in promises"));
The above code will log this:
[ ' => PATH=/Users/mykolaborysiuk/Sites/circlemedia/syndic-apiManager/tasks',
'Not found in /Users/mykolaborysiuk/Sites/circlemedia/syndic-apiManager/aaa',
'Not found in /Users/mykolaborysiuk/Sites/circlemedia/syndic-apiManager/bbbb' ]
Hope this helps.
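As a side note, on newer Node versions (10 and up) you can drop the hand-rolled wrapper, since fs exposes a promise-based API directly. A sketch of the same checkFileExists under that assumption:

const fsp = require('fs').promises;

function checkFileExists(path) {
    // fsp.stat returns a promise, so no manual new Promise() is needed
    return fsp.stat(path + "/project.xml")
        .then(() => " => PATH=" + path)
        .catch(() => "Not found in " + path);
}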
Related
So I use this API that helps me turn a .docx file into a .pdf. I placed the code that converts the file into a function:
function conv() {
    convertapi.convert('pdf', { File: final_path })
        .then(function (result) {
            // get converted file url
            console.log("Converted file url: " + result.file.url);
            finp = path + file_name.slice(0, file_name.length - 5) + ".pdf";
            console.log(finp);
            // save to file
            return result.file.save(finp);
        })
        .then(function (file) {
            console.log("File saved: " + file);
            process.exit(1);
        })
        .catch(function (e) {
            // Romanian: "the file name and/or extension are wrong"
            console.log("numele si/sau extensia fisierului sunt gresite");
            process.exit(1);
        });
}
The code above works for only one file at a time. I made a loop that goes through every .docx file in my folder and saves its name into an array. Then I go through every item of the array and call the function:
for (var j = 0; j <= i; j++) {
    file_name = toate_nume[j];
    final_path = path + file_name;
    conv();
}
The file names are stored correctly, but when I run my project, the function is only called after the loop itself has ended (it is called the correct number of times overall). So if I have two files, test1.docx and test2.docx, the output shows me that conv() is called twice for test2.docx instead of once for each file. What should I do?
The reason might be this: the API is slow, so your program executes the loop faster than the API can handle the requests. What ends up happening is that you modify the final_path variable twice before convertapi gets called, and then it gets called twice with the same final_path. Try to modify your conv function so that it accepts a parameter, e.g. path, and uses that. Then call conv with the current final_path as the argument:
conv(final_path)
And:
function conv(path) {
    convertapi.convert('pdf', { File: path })
    ...
So you are calling n promises in series, and you want to wait until they have all finished? You can use Promise.all:
const toate_nume = ['fileName1', 'fileName2'];

const spawn = toate_nume.map(x => {
    const final_path = path + x;
    return conv(final_path);
});

Promise.all(spawn).then(results => {
    console.log('All operations done successfully %o', results);
});
Or use await (remember that await is only valid inside an async function):
const results = await Promise.all(spawn);
results is an array with one entry per call.
Note: I pass the path as an argument instead of using a global variable.
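For Promise.all to actually wait, conv has to return the promise chain; otherwise spawn is an array of undefined values and Promise.all resolves immediately. A minimal sketch of the adjusted function (it reuses the question's convertapi client, and drops the process.exit() calls, which would kill the process after the first file):

function conv(path) {
    // return the chain so Promise.all can track completion
    return convertapi.convert('pdf', { File: path })
        .then(function (result) {
            // ".docx" is 5 characters, so slicing 5 off strips the extension
            return result.file.save(path.slice(0, path.length - 5) + ".pdf");
        });
}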
I built an Angular/Node app that renames files in network folders. The number of files it renames is between 300 and 500. I use await so I get notified when renaming is done. It takes 8-10 minutes per run, and it can't rename files simultaneously since I am using await.
I need to report the number of renamed files, and I need to show the user that the renaming is complete. If I don't use async/await, how can my Angular front end know that the renaming has finished?
My full code is in here: https://github.com/ericute/renamer
Here's where I'm having trouble:
await walk(folderPath, function (err, results) {
    if (err) throw err;
    results.forEach(file => {
        if (fs.lstatSync(file).isFile) {
            fileCounter++;
        }
        let fileBasename = path.basename(file);
        let filePath = path.dirname(file);
        if (!filesForRenaming[path.basename(file)]) {
            // In a JavaScript forEach loop,
            // return is the equivalent of continue:
            // https://stackoverflow.com/questions/31399411/go-to-next-iteration-in-javascript-foreach-loop
            return;
        }
        let description = filesForRenaming[path.basename(file)].description;
        // Process instances where the absolute file name exceeds 255 characters.
        let tempNewName = path.resolve(filePath, description + "_" + fileBasename);
        let tempNewNameLength = tempNewName.length;
        let newName = '';
        if (tempNewNameLength > 255) {
            let excess = 254 - tempNewNameLength;
            if (description.length > Math.abs(excess)) {
                description = description.substring(0, (description.length - Math.abs(excess)));
            }
            newName = path.resolve(filePath, description + "_" + fileBasename);
        } else {
            newName = tempNewName;
        }
        renamedFiles++;
        // Actual file renaming
        fs.renameSync(file, newName, (err) => {
            if (err) {
                errList.push(err);
            }
            renamedFiles++;
        });
    });
    if (Object.keys(errList).length > 0) {
        res.send({ "status": "error", "errors": errList });
    } else {
        res.send({
            "status": "success",
            "filesFoundInDocData": Object.keys(filesForRenaming).length,
            "filesFound": fileCounter,
            "renamedFiles": renamedFiles,
            "startDate": startDate
        });
    }
});
If you're using any sync methods you're basically blocking the event loop, so you should restructure your code to use promises everywhere. You can then create another service in Angular that checks whether the renaming process has completed, using an interval plus GET requests (the easiest way). For example, Angular could poll a route like "/isRenameCompleted" and alert the user once the result says it's done. For real-time results you would switch to socket.io. A quick solution for one client (for more, you would need to store a unique ID per request and fetch the promises accordingly) could be this one:
1: Create two global variables at the top of your code:
var filesStatus = "waiting";
var pendingFiles = [];
2: Inside your renaming route, push every rename into the promise array using a loop, then start waiting asynchronously for the whole batch to finish:
pendingFiles.push(fsPromises.rename(oldName, newName));

Promise.all(pendingFiles)
    .then(values => {
        filesStatus = "done";
    })
    .catch(error => {
        filesStatus = "error";
    });

filesStatus = "pending"; // runs synchronously, before any of the promises settle
3: Now add a new /isRenameCompleted route with reporting logic like the following:
router.get('/isRenameCompleted', (req, res, next) => {
    if (filesStatus === "pending") {
        res.end("please wait");
    } else if (filesStatus === "done") {
        res.end("done! your files renamed");
    } else {
        res.end(filesStatus); // "waiting" or "error", so the request never hangs
    }
});
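On the client side, the simplest version of that polling service is to hit the route on an interval until it reports success. A minimal browser-flavoured sketch with fetch and setInterval (in Angular you would wrap the same idea in a service using HttpClient; the 2-second interval is an arbitrary choice):

// poll /isRenameCompleted every 2 seconds until the server reports "done"
var timer = setInterval(function () {
    fetch('/isRenameCompleted')
        .then(function (res) { return res.text(); })
        .then(function (text) {
            if (text.indexOf('done') !== -1) {
                clearInterval(timer);
                alert('Renaming complete!');
            }
        });
}, 2000);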
var promisePipe = require("promisepipe");
var fs = require("fs");
var crypt = require("crypto");
var // ....
    files = ['/mnt/Storage/test.txt', '/mnt/Storage/test2.txt', '/mnt/Storage/test3.txt'];

var promises = files.map(function (file_enc) {
    return new Promise(function (resolve, reject) {
        var file_out = file_enc + '.locked';
        promisePipe(
            fs.createReadStream(file_enc),
            crypt.createCipheriv(alg, genhashsub, iv),
            fs.createWriteStream(file_out),
        ).then(function (streams) {
            console.log('File written: ' + file_out);
            // Promise.resolve(file_out); // tried but doesn't seem to do anything
        }, function (err) {
            if (err.message.substring(0, 7) === 'EACCES:') {
                console.log('Error (file ' + file_out + '): Insufficient rights on file or folder');
            } else {
                console.log('Error (file ' + file_out + '): ' + err);
            }
            // Promise.reject(new Error(err)); // tried but doesn't seem to do anything
        });
    });
});

Promise.all(promises).then(final_function(argument));
I'm trying to encrypt files contained in an array named files.
For the sake of simplicity I added them manually in this example.
What I want to happen:
Create a promise array to await with Promise.all on completion
Iterate through the array
Create a promise for each IO operation
Read file \
Encrypt file -- all done using streams, due to large files (+3GB)
Write file /
On finish write, resolve the promise for this IO operation
Run a finishing script once all promises have resolved (or rejected)
What happens:
Encryption of first file starts
.then(final_function(argument)) is called
Encryption of first file ends
The files all get encrypted correctly and they can be decrypted afterwards.
Also, errors are displayed, as well as the write confirmations.
I've searched both Stack Overflow and Google and found some similar questions (with answers), but they don't help, because many are outdated: they work until I rewrite them to use promises, and then I'm back where I started.
I could also list eight different ways I've attempted this job, using npm modules or vanilla code, but all of them fail in one way or another.
If you already have a promise at your disposal (and promisepipe appears to create a promise), then you generally should not use new Promise(). It looks like your main problem is that you are creating promises that you never resolve.
The other problem is that you are calling final_function in the last line instead of passing a function that will call final_function.
I suggest giving this a try:
var promises = files.map(function(file_enc) {
var file_out = file_enc + '.locked';
return promisePipe(
fs.createReadStream(file_enc),
crypt.createCipheriv(alg, genhashsub, iv),
fs.createWriteStream(file_out),
).then(function(streams){
console.log('File written: ' + file_out);
return file_out;
}, function(err) {
if(err.message.substring(0, 7) === 'EACCES:') {
console.log('Error (file ' + file_out + '): Insufficient rights on file or folder');
} else {
console.log('Error (file ' + file_out + '): ' + err);
}
throw new Error(err);
});
});
Promise.all(promises).then(() => final_function(argument));
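One thing to keep in mind: because the error handler re-throws, Promise.all will reject as soon as any single file fails, and final_function will never run. If the finishing script should run regardless, attach a catch; a sketch of that variation:

Promise.all(promises)
    .then(() => final_function(argument))
    .catch(function (err) {
        // reached as soon as the first file fails
        console.log('At least one file failed: ' + err.message);
    });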
Here's an analogous short, file-based process wrapped in Promise.all. You could do your encryption inside encrypt(), which is wrapped in a file handler; encrypt() returns a promise. segments is your array of files needing work.
var filHndlr = function (segment) {
    var uri = segment.uri;
    var path = '/tmp/' + uri;
    return that.getFile(path)
        .then(datatss => {
            return that.encrypt(uri, datatss);
        });
};

...

Promise.all(segments.map(filHndlr))
    .then(resp => { ... });
I am new to node.js and maybe I am doing something wrong.
There's this hugely popular async recursive copy utility npmjs.org/package/ncp.
I am trying to run it in parallel:
var ncp = require('ncp').ncp;

var dirs = [
    ['test/from/1', 'test/to/1'],
    ['test/from/2', 'test/to/2'],
    ['test/from/3', 'test/to/3']
];

var copyDirAsync = function (dir) {
    ncp(dir[0], dir[1], function (err) {
        console.log('done: ' + dir[1]);
    });
};

for (var i = 0; i < dirs.length; ++i) {
    copyDirAsync(dirs[i]);
}
So, all dirs copy just fine. However, I get only one console.log message, printed for a random directory; the other two never arrive. The program just exits. If I add a 15-second timeout so that node keeps running for a while, the callbacks still don't arrive. I would assume this is a problem with ncp, but with 30K downloads per day of a 1-month-old release (0.5.0) and no issues reported so far, plus me being a newcomer to node.js, I'll just assume I don't understand something about node.
First read Asynchronous iteration patterns.
Now, you can use the async module, specifically async.each, like so:
var ncp = require('ncp').ncp
  , async = require('async');

var dirs = [
    ['test/from/1', 'test/to/1'],
    ['test/from/2', 'test/to/2'],
    ['test/from/3', 'test/to/3']
];

var copyDirAsync = function (dir, done) {
    ncp(dir[0], dir[1], function (err) {
        if (err) return done(err);
        console.log('done: ' + dir[1]);
        done();
    });
};

async.each(dirs, copyDirAsync, function (err) {
    console.log(err);
});
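If you'd rather not add the async dependency, you can get the same effect by wrapping ncp in a promise yourself; a sketch using the same dirs array:

var copyDir = function (dir) {
    return new Promise(function (resolve, reject) {
        ncp(dir[0], dir[1], function (err) {
            if (err) return reject(err);
            console.log('done: ' + dir[1]);
            resolve(dir[1]);
        });
    });
};

// run all three copies and wait for the whole batch
Promise.all(dirs.map(copyDir))
    .then(function () { console.log('all copies finished'); })
    .catch(function (err) { console.log(err); });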
So I'm trying to use the Node.js fs module in my Express app to iterate a directory, store each filename in an array, and pass that array to my Express view so I can iterate through the list, but I'm struggling to do so. When I do a console.log within the files.forEach loop, it prints the filename just fine, but as soon as I try to do anything such as:
var myfiles = [];
var fs = require('fs');

fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        myfiles.push(file);
    });
});
console.log(myfiles);
it fails and just logs an empty array. I'm not sure exactly what is going on; I think it has to do with callback functions. If someone could walk me through what I'm doing wrong and why it's not working (and how to make it work), it would be much appreciated.
The myfiles array is empty because the callback hasn't been called before you call console.log().
You'll need to do something like:
var fs = require('fs');

fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        // do something with each file HERE!
    });
});
// because trying to do something with files here won't work because
// the callback hasn't fired yet
Remember, everything in node happens at the same time, in the sense that, unless you're doing your processing inside your callbacks, you cannot guarantee asynchronous functions have completed yet.
One way around this problem for you would be to use an EventEmitter:
var fs = require('fs'),
    EventEmitter = require('events').EventEmitter,
    filesEE = new EventEmitter(),
    myfiles = [];

// this event will be called when all files have been added to myfiles
filesEE.on('files_ready', function () {
    console.dir(myfiles);
});

// read all files from the current directory
fs.readdir('.', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        myfiles.push(file);
    });
    filesEE.emit('files_ready'); // trigger files_ready event
});
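A small refinement on the same pattern: emit an 'error' event instead of throwing inside the callback, so failures end up in one handler too (a sketch):

filesEE.on('error', function (err) {
    console.error('readdir failed:', err);
});

fs.readdir('.', function (err, files) {
    if (err) return filesEE.emit('error', err);
    myfiles = files; // files is already an array
    filesEE.emit('files_ready');
});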
As several have mentioned, you are using an async method, so you have a nondeterministic execution path.
However, there is an easy way around this. Simply use the Sync version of the method:
var myfiles = [];
var fs = require('fs');

var arrayOfFiles = fs.readdirSync('./myfiles/');

// Yes, the following is not super-smart, but you might want to process the files. This is how:
arrayOfFiles.forEach(function (file) {
    myfiles.push(file);
});

console.log(myfiles);
That should work as you want. However, using sync statements is not good, so you should not do it unless it is vitally important for it to be sync.
Read more here: fs.readdirSync
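Since the sync variant throws instead of passing an error to a callback, wrap it in try/catch if the directory might be missing; a minimal sketch:

var fs = require('fs');

var myfiles = [];
try {
    myfiles = fs.readdirSync('./myfiles/');
} catch (err) {
    // readdirSync throws on a missing or unreadable directory
    console.error('Could not read directory: ' + err.message);
}
console.log(myfiles);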
fs.readdir is asynchronous (as with many operations in node.js). This means that the console.log line is going to run before readdir has a chance to call the function passed to it.
You need to either:
Put the console.log line within the callback function given to readdir, i.e.:
fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    files.forEach(function (file) {
        myfiles.push(file);
    });
    console.log(myfiles);
});
Or simply perform some action with each file inside the forEach.
"I think it has to do with callback functions"
Exactly.
fs.readdir makes an asynchronous request to the file system for that information, and calls the callback at some later time with the results.
So function (err, files) { ... } doesn't run immediately, but console.log(myfiles) does.
At some later point in time, myfiles will contain the desired information.
You should note, BTW, that files is already an array, so there is really no point in manually appending each element to some other blank array. If the idea is to put together the results from several calls, then use .concat; if you just want to get the data once, you can assign myfiles = files directly.
Overall, you really ought to read up on "Continuation-passing style".
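For example, the direct-assignment version is just (a minimal sketch):

var fs = require('fs');
var myfiles = [];

fs.readdir('./myfiles/', function (err, files) {
    if (err) throw err;
    myfiles = files; // no need to push element by element
    console.log(myfiles);
});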
I faced the same problem, and based on the answers given in this post I've solved it with Promises, which seem to be a perfect fit for this situation:
router.get('/', (req, res) => {
    var viewBag = {}; // It's just my little habit from .NET MVC ;)

    var readFiles = new Promise((resolve, reject) => {
        fs.readdir('./myfiles/', (err, files) => {
            if (err) {
                reject(err);
            } else {
                resolve(files);
            }
        });
    });

    // showcase, just in case you need to implement more async operations before the route responds
    var anotherPromise = new Promise((resolve, reject) => {
        doAsyncStuff((err, anotherResult) => {
            if (err) {
                reject(err);
            } else {
                resolve(anotherResult);
            }
        });
    });

    Promise.all([readFiles, anotherPromise]).then((values) => {
        viewBag.files = values[0];
        viewBag.otherStuff = values[1];
        console.log(viewBag.files); // logs e.g. [ 'file.txt' ]
        res.render('your_view', viewBag);
    }).catch((errors) => {
        res.render('your_view', { errors: errors }); // you can use the 'errors' property to render errors in the view or implement a different error handling schema
    });
});
Note: you don't have to push the found files into a new array because you already get an array from fs.readdir()'s callback. According to the node docs:
The callback gets two arguments (err, files) where files is an array of the names of the files in the directory excluding '.' and '..'.
I believe this is a very elegant and handy solution, and best of all, it doesn't require you to bring in and handle new modules in your script.