Is fs_promises_readdir an async function? - node.js

Basically, what I'm trying to do is read all the file names from a directory, and I found fsPromises.readdir, which does pretty much what I want:
Returns with an array of the names of the files in the directory.
I have taken this code from the Node.js documentation page, here:
https://nodejs.org/api/fs.html#fs_fspromises_readdir_path_options
import { readdir } from 'fs/promises'; // I changed this to : const readdir = require('fs/promises');

try {
  const files = await readdir(path);
  for await (const file of files)
    console.log(file);
} catch (err) {
  console.error(err);
}
When I run this, the console gives me the following error:
const files = await readdir(path);
^^^^^
SyntaxError: await is only valid in async function
This is weird since, according to the documentation, this function returns a Promise, and if it returns a promise that means it is async. Right?
So, what am I missing?

The error you're seeing refers to the fact that you're using the await keyword outside of an async function.
Top-level await is allowed in ES modules in recent versions of Node (14.8+), but if this is a CommonJS file or you're running an older version, you will need to wrap your code in an async function:
(async function () {
  try {
    const files = await readdir(path);
    for await (const file of files)
      console.log(file);
  } catch (err) {
    console.error(err);
  }
})();

Related

Pending promise in aws lambda function

I'm using the library JSZip to unzip a file, and I am able to get each of the file names. However, what I want to do is take the content of each file to do some processing. I know you can do this with zip.file(name).async(type).then() according to the API.
For some reason, though, when I do that in my Lambda function, it doesn't hit that function at all. I tried returning that line of code, but I just got Promise <pending>... I tried wrapping it with a try/catch, and that didn't do anything. I tried async/await, but that didn't work either. A callback like:
zip.file(name).async('blob').then(function (content) {
  // Do processing in here
});
doesn't seem to work either. What do I need to do to get the content of a specific file? Nothing I have done works, and I think it has to do with the pending promise. I'm stuck at this point and have no idea what to do. Any help is greatly appreciated. For reference, here is my code for how I'm doing it:
let zip = new JSZip();
zip.loadAsync(file).then(function (contents) {
  Object.keys(contents.files).forEach(function (name) {
    zip.file(name).async('blob').then(function (content) {
      // Processing here
    });
  });
});
EDIT
Full code:
index.js
const manager = require("./manager.js");
exports.handler = async (event, context, callback) => {
  return manager.manager(event, callback);
};
manager.js
exports.manager = async function (event, callback) {
  const path = '/temp/' + fileName;
  const file = fs.readFileSync(path);
  let zip = new JSZip();
  zip.loadAsync(file).then(function (contents) {
    Object.keys(contents.files).forEach(function (name) {
      zip.file(name).async('blob').then(function (content) {
        // Processing here
      });
    });
  });
}
Change the function so that it returns the promise chain:
return zip.loadAsync(file).then(...)
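The deeper issue is that forEach never waits for the promises it creates, so the Lambda's returned promise can settle before any file content arrives. A minimal sketch of the fix, with a plain promise standing in for the JSZip calls (processEntry and handler are illustrative names, not from the original code):

```javascript
// Stand-in for zip.file(name).async(...): resolves a little later
const processEntry = name =>
  new Promise(resolve => setTimeout(() => resolve(name.toUpperCase()), 10));

// Collect every per-entry promise and await them all, so the handler's
// returned promise only settles once all entries are processed.
async function handler(names) {
  const results = await Promise.all(names.map(processEntry));
  return results;
}

handler(['a.txt', 'b.txt']).then(results => console.log(results));
// logs [ 'A.TXT', 'B.TXT' ]
```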

How to make fs.readFile async await?

I have this Node.js code here which reads a folder and processes the files. The code works, but it still prints all the file names first and only then reads the files. How do I read each file's content right after getting its name, instead of listing all the files first?
async function readingDirectory(directory) {
  try {
    fileNames = await fs.readdir(directory);
    fileNames.map(file => {
      const absolutePath = path.resolve(folder, file);
      log(absolutePath);
      fs.readFile(absolutePath, (err, data) => {
        log(data); // How to make it async await here?
      });
    });
  } catch {
    console.log('Directory Reading Error');
  }
}
readingDirectory(folder);
To use await, you need the promise versions of fs.readFile() and fs.readdir(), which you can get from fs.promises; and if you want these to run sequentially, use a for loop instead of .map():
async function readingDirectory(directory) {
  const fileNames = await fs.promises.readdir(directory);
  for (let file of fileNames) {
    const absolutePath = path.join(directory, file);
    log(absolutePath);
    const data = await fs.promises.readFile(absolutePath);
    log(data);
  }
}

readingDirectory(folder).then(() => {
  log("all done");
}).catch(err => {
  log(err);
});

Mongoose Function Cannot be Called in different file in Node.Js

I created some functions containing MongoDB methods in one file. They work well when I access them from the same file, but when I try to access a function from another file, it doesn't work.
Here is the code
const Chain = require('../database/models/chains')

const getlatestChain = async () => {
  try {
    const thechains = await Chain.countDocuments()
    if (thechains < 2) {
      throw new Error('there is only one chain!')
    }
    return thechains
  } catch (error) {
    return error
  }
}

module.exports = {
  getlatestChain: getlatestChain
}
It doesn't work when I call it from another file
const thechain = require('../src/utils/chain')
require('../src/database/database')

thechain.getlatestChain()
  .then((result) => {
    console.log('how many documents : ' + result)
  }).catch((err) => {
    console.log(err)
  });
error
TypeError: Chain.countDocuments is not a function
Check the chains model to make sure you are exporting a Mongoose model (the result of mongoose.model(...)), since countDocuments only exists on models; check the spelling of the export as well.

Problem with async when downloading a series of files with nodejs

I'm trying to download a bunch of files, say 1.jpg, 2.jpg, 3.jpg, and so on. If 1.jpg exists, I want to try to download 2.jpg; and if that exists, I will try the next, and so on.
But the current getFile returns a promise, so I can't loop through it. I thought I had solved it by adding await in front of the http.get method, but it looks like it doesn't wait for the callback to finish. Is there a more elegant way to solve this than wrapping the whole thing in a new async method?
const getFile = async (url, saveName) => {
  try {
    const file = fs.createWriteStream(saveName);
    const request = await http.get(url, function (response) {
      const { statusCode } = response;
      if (statusCode === 200) {
        response.pipe(file);
        return true;
      } else
        return false;
    });
  } catch (e) {
    console.log(e);
    return false;
  }
}

// this returns a promise
var result = getFile(url, fileToDownload);
Your getFile method does return a promise (every async function does), but not one that resolves to your result, and there is no point in awaiting a callback. You should split the functionality into two parts:
- getFile, which fetches the file
- save, which saves the file if getFile returns something
Try code like this:
const getFile = url => {
  return new Promise((resolve, reject) => {
    http.get(url, response => {
      const { statusCode } = response;
      if (statusCode === 200) {
        resolve(response);
      } else {
        reject(new Error('Request failed with status ' + statusCode));
      }
    });
  });
};

async function save(url, saveName) {
  const response = await getFile(url);
  const file = fs.createWriteStream(saveName);
  response.pipe(file);
}
What you are trying to do is fetch the images in a sequential fashion.
Possible solutions:
- If you know the exact number of images, go ahead with the request or http module and use a promise chain.
- If you do not know the exact number but want to stop at image N-1 when N is not found, the sync-request module is an option.
Your getFile does return a promise, but only because it has the async keyword before it, and it's not the kind of promise you want. http.get uses old callback-style handling; luckily, it's easy to wrap it in a Promise to suit your needs:
const tryToGetFile = (url, saveName) => {
  return new Promise((resolve) => {
    http.get(url, response => {
      if (response.statusCode === 200) {
        const stream = fs.createWriteStream(saveName)
        response.pipe(stream)
        resolve(true);
      } else {
        // usually it is better to reject the promise and propagate errors further,
        // but the function is called tryToGetFile as it expects that some file will
        // not be available, and this is not an error. Simply resolve to false
        resolve(false);
      }
    })
  })
}

const fileUrls = [
  'somesite.file1.jpg',
  'somesite.file2.jpg',
  'somesite.file3.jpg',
  'somesite.file4.jpg',
]

const downloadInSequence = async () => {
  // using for..of instead of forEach to be able to pause
  // downloadInSequence function execution while getting a file
  // (a classic for loop also works)
  for (const fileUrl of fileUrls) {
    const success = await tryToGetFile('http://' + fileUrl, fileUrl)
    if (!success) {
      // file with this name wasn't found
      return;
    }
  }
}
This is a basic setup showing how to wrap http.get in a Promise and run requests in sequence; add error handling wherever you want. It's also worth noting that it will proceed to the next file as soon as it has received a 200 status code and started the download, rather than waiting for the full download to finish before proceeding.

NodeJS copyFile then unlink in callback

I have a large number of files (millions) that I want to copy from one folder to another, but then, based on some parsing options, I will need to delete each file if it doesn't meet certain criteria.
I'm doing the parsing on my local copy of the file because it would be slower over the network (read file + parse) than doing it locally.
My code looks like this:
for (let file of files) {
  fs.copyFile(from, to, err => {
    if (err) return;
    parse(file);
  });
}
The parse function is something like :
parse = file => {
fs.readFile(file, (err, data) => {
//do some parsing
if (notOk) {
fs.unlink(file);
};
}
}
The problem is that it runs all the copyFile calls but never seems to execute the callbacks with the unlink, and I really need to unlink each file as soon as it has finished copying, since I can't afford the disk space to copy all the files first.
Do I need to use the Sync version of those methods or something else?
Thanks
I would have expected an output like this :
copyFile a
copyFile b
copyFile c
parsing a
copyFile d
unlink a
parsing b
copyFile e
...
but instead I have all copyFile and none of the parsing/unlink happening.
If files contains millions of files, you're starting millions of concurrent copy actions, which is probably overwhelming your system.
Instead, consider using something like async.eachLimit to limit the number of concurrent copy actions.
Untested example:
const async = require('async');

async.eachLimit(files, 10, (file, callback) => {
  fs.copyFile(from, to, err => {
    if (err) return callback(); // or pass `err`
    parse(file, callback);
  });
}, err => {
  // completion callback: runs once every file is done (or an error was passed)
  if (err) console.error(err);
});

function parse(file, callback) {
  fs.readFile(file, (err, data) => {
    // do some parsing
    if (notOk) {
      fs.unlink(file, callback);
    } else {
      return callback();
    }
  });
}
I think what is happening is that the loop finishes scheduling all the copies, but nothing waits for the callbacks to complete their work. You could make this synchronous, which would work and be clearer, but another way is to convert the functions to promises so you can return them and wait for them.
You could write something like this.
const util = require('util');
const fs = require('fs');

const files = ['1.txt', '2.txt', '3.txt', '4.txt'];

// promisify all the fs functions to use
const copyFilePromise = util.promisify(fs.copyFile);
const readFilePromise = util.promisify(fs.readFile);
const unlinkPromise = util.promisify(fs.unlink);

// make a parser function that returns a promise
const parse = (file) => {
  return readFilePromise(file)
    .then(() => {
      // do some parsing
      if (notOk)
        return unlinkPromise(file)
    });
};

// store all the promises in an array
let promises = []

// iterate over all the files using the promisified functions
for (let file of files) {
  // note: pass a function to .then(); calling parse(file) directly
  // would run it immediately instead of after the copy
  promises.push(copyFilePromise(from, to).then(() => parse(file)).catch(console.error));
}

// wait for all the promises to resolve
Promise.all(promises);
