Node js issue with loop order - node.js

Hello, I really need help with this issue: my last console.log executes BEFORE the for loop finishes and I don't know how to fix it. I really need access to my array nbfilm after the for loop.
Can someone help me?
What the console prints: [screenshot link]
client.db.query("SELECT name,id FROM film", function (err, result) {
if (err) throw err;
const catalog = new MessageEmbed()
.setTitle("Catalogue")
.setColor("#fcfe80")
.setFooter({text:"🍿 ・ PopFlix"})
let testresult =[]
let nbfilm =[]
for (let compteur of result){
testresult.push(compteur.id)
testresult.push(compteur.name)
}
console.log(testresult)
for (let compteur2 = 0; compteur2 < testresult.length; compteur2+=2){
client.db.query(`SELECT link FROM lien WHERE fid=${testresult[compteur2]}`, function (err,result) {
nbfilm.push(testresult[compteur2+1])
nbfilm.push(result.length)
console.log("nbfilm in for loop",nbfilm)
});
}
console.log("nbfilmAFTER",nbfilm)
});

The body of the loop delays execution. Because JavaScript performs I/O asynchronously, such calls typically return (or can be wrapped in) a promise. In other words, the code runs in the order it is written, but the results of the queries only become visible after all pending tasks have completed. In your case, adding async/await to the code should help; see the async/await material in the Node.js docs.

It looks like client.db.query is asynchronous. JS here works as expected: it doesn't wait for the query to finish before moving on to the next line.
It's now considered better practice to use async/await instead of callbacks. If you tell us which database package you are using, we can provide a code example.
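For example, if client.db.query follows the standard Node.js (err, result) callback convention (an assumption, since the package isn't named), util.promisify lets you await each query so the loop finishes before the final log. This is a minimal sketch, and buildCatalog is a hypothetical helper, not a drop-in fix:

const util = require("util");

// Hypothetical wrapper; assumes client.db.query(sql, callback) uses the
// standard (err, result) callback convention.
const query = util.promisify(client.db.query).bind(client.db);

async function buildCatalog() {
  const films = await query("SELECT name,id FROM film"); // assumed to resolve to an array of rows
  const nbfilm = [];
  for (const film of films) {
    // each iteration waits for its query before moving on
    const links = await query(`SELECT link FROM lien WHERE fid=${film.id}`);
    nbfilm.push(film.name, links.length);
  }
  console.log("nbfilmAFTER", nbfilm); // runs only after every query has finished
  return nbfilm;
}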

client.db.query() is asynchronous, so its callback won't run until after console.log("nbfilmAFTER", nbfilm) executes, regardless of how fast the query itself is.
I'd recommend using Promise.all(). You will also have to "promisify" the query() function. Pass everything you want into resolve(), and then concatenate the results at the end, as below.
let promises = [];
for (....) {
  promises.push(new Promise((resolve, reject) => {
    client.db.query("select ... ", (err, result) => {
      if (err) return reject(err);
      // all the same stuff as before, but resolved instead of pushed here
      resolve([testresult[compteur2 + 1], result.length]);
    });
  }));
}

Promise.all(promises)
  .then((results) => results.reduce((p, c) => p.concat(c), []))
  .then((data) => console.log(data)); // do whatever you want with "data"
Here's a simplified demo:
https://codesandbox.io/s/promise-all-example-78r347

Related

I need to access all axios data after the for loop

I'm making a simple word-combination website, and as a final step I need all possible words in one string, so I wrote code like this:
const fs = require('fs');
const axios = require('axios');

function test(want) {
  const res = axios.get("http://api.dictionaryapi.dev/api/v2/entries/en/" + want);
  const datapromise = res.then((res) => res.data);
  return datapromise;
}

fs.readFile('./input.txt', 'utf-8', function (error, data) {
  //console.log("console log")
  var array = data.toString().split("\n");
  fs.writeFile("./log.txt", "", (err) => {});
  var res = "";
  for (i in array) {
    test(array[i]).then((data) => (data) => res += data[0].word + "<br>").catch(/*(data)=>console.log(data.code)*/);
  }
  console.log(res);
});
But this code doesn't work: console.log(res); is executed first, followed by the for loop.
How can I fix it?
Without knowing much about Axios, I can tell that axios.get, and therefore the test function, is async. This means the console.log will always run first, because test returns a promise that resolves at a later time.
I'd do something like this (assuming you don't have async/await available; an async/await sketch follows the notes below):
var res= "";
var promises = [];
for(i in array) {
promises.push(
test(array[i]).then((data) => res+=data[0].word + "<br>")
);
}
Promise.all(promises).finally(() => {
console.log(res);
});
Other notes:
The catch here is being called but nothing is being passed in - this may result in an error.
The then has a nested function: (data) => (data) => basically creates a second, nested function that I don't think would ever get called.
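If async/await is available, a rough equivalent (my sketch, reusing the question's test helper) could look like this:

fs.readFile('./input.txt', 'utf-8', async function (error, data) {
  const array = data.toString().split("\n");
  // Kick off all requests, then wait for every one of them.
  // Note: a single failed lookup will reject Promise.all.
  const results = await Promise.all(array.map((word) => test(word)));
  const res = results.map((data) => data[0].word).join("<br>");
  console.log(res); // runs only after all lookups have resolved
});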

nodejs Await promises working but log shows different order [duplicate]

I have a function with multiple forEach loops:
async insertKpbDocument(jsonFile) {
  jsonFile.doc.annotations.forEach((annotation) => {
    annotation.entities.forEach(async (entity) => {
      await this.addVertex(entity);
    });
    annotation.relations.forEach(async (relation) => {
      await this.addRelation(relation);
    });
  });
  return jsonFile;
}
I need to make sure that the async code in the forEach loop calling this.addVertex is really done before executing the next one.
But when I log variables, it seems that this.addRelation is called before the first loop is really over.
So I tried adding await before every loop, like so:
await jsonFile.doc.annotations.forEach(async (annotation) => {
  await annotation.entities.forEach(async (entity) => {
    await this.addVertex(entity);
  });
  await annotation.relations.forEach(async (relation) => {
    await this.addRelation(relation);
  });
});
But the behavior is the same.
Maybe it is the log function that has latency? Any ideas?
As we've discussed, await does not pause a .forEach() loop and does not make the 2nd item of the iteration wait for the first item to be processed. So, if you're really trying to do asynchronous sequencing of items, you can't really accomplish it with a .forEach() loop.
For this type of problem, async/await works really well with a plain for loop because they do pause the execution of the actual for statement to give you sequencing of asynchronous operations which it appears is what you want. Plus, it even works with nested for loops because they are all in the same function scope:
To show you how much simpler this can be using for/of and await, it could be done like this:
async insertKpbDocument(jsonFile) {
  for (let annotation of jsonFile.doc.annotations) {
    for (let entity of annotation.entities) {
      await this.addVertex(entity);
    }
    for (let relation of annotation.relations) {
      await this.addRelation(relation);
    }
  }
  return jsonFile;
}
You get to write synchronous-like code that is actually sequencing asynchronous operations.
If you are really avoiding any for loop, and your real requirement is only that all calls to addVertex() come before any calls to addRelation(), then you can do this where you use .map() instead of .forEach() and you collect an array of promises that you then use Promise.all() to wait on the whole array of promises:
insertKpbDocument(jsonFile) {
  return Promise.all(jsonFile.doc.annotations.map(async annotation => {
    await Promise.all(annotation.entities.map(entity => this.addVertex(entity)));
    await Promise.all(annotation.relations.map(relation => this.addRelation(relation)));
  })).then(() => jsonFile);
}
To fully understand how this works, this runs all addVertex() calls in parallel for one annotation, waits for them all to finish, then runs all the addRelation() calls in parallel for one annotation, then waits for them all to finish. It runs all the annotations themselves in parallel. So, this isn't very much actual sequencing except within an annotation, but you accepted an answer that has this same sequencing and said it works so I show a little simpler version of this for completeness.
If you really need to sequence each individual addVertex() call so you don't call the next one until the previous one is done and you're still not going to use a for loop, then you can use the .reduce() promise pattern put into a helper function to manually sequence asynchronous access to an array:
// helper function to sequence asynchronous iteration of an array
// fn returns a promise and is passed an array item as an argument
function sequence(array, fn) {
  return array.reduce((p, item) => {
    return p.then(() => {
      return fn(item);
    });
  }, Promise.resolve());
}

insertKpbDocument(jsonFile) {
  return sequence(jsonFile.doc.annotations, async (annotation) => {
    await sequence(annotation.entities, entity => this.addVertex(entity));
    await sequence(annotation.relations, relation => this.addRelation(relation));
  }).then(() => jsonFile);
}
This will completely sequence everything. It will do this type of order:
addVertex(annotation1)
addRelation(relation1);
addVertex(annotation2)
addRelation(relation2);
....
addVertex(annotationN);
addRelation(relationN);
where it waits for each operation to finish before going onto the next one.
forEach returns void, so awaiting it will not do much. You can use map to return all the promises you currently create in the forEach, and use Promise.all to await them all:
async insertKpbDocument(jsonFile: { doc: { annotations: Array<{ entities: Array<{}>, relations: Array<{}> }> } }) {
  await Promise.all(jsonFile.doc.annotations.map(async (annotation) => {
    await Promise.all(annotation.entities.map(async (entity) => {
      await this.addVertex(entity);
    }));
    await Promise.all(annotation.relations.map(async (relation) => {
      await this.addRelation(relation);
    }));
  }));
  return jsonFile;
}
I understand you can run all the addVertex calls concurrently. Combining reduce with map, split into two different sets of promises, you can do it. My idea:
const first = jsonFile.doc.annotations.reduce((acc, annotation) => {
  acc = acc.concat(annotation.entities.map((entity) => this.addVertex(entity)));
  return acc;
}, []);
await Promise.all(first);

const second = jsonFile.doc.annotations.reduce((acc, annotation) => {
  acc = acc.concat(annotation.relations.map((relation) => this.addRelation(relation)));
  return acc;
}, []);
await Promise.all(second);
You have more loops, but it does what you need I think
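As a side note (my addition, assuming a runtime where Array.prototype.flatMap is available, i.e. Node 11+), the two reduce/concat passes can be written a bit more compactly:

// all addVertex calls first, then all addRelation calls
await Promise.all(jsonFile.doc.annotations.flatMap((a) => a.entities.map((e) => this.addVertex(e))));
await Promise.all(jsonFile.doc.annotations.flatMap((a) => a.relations.map((r) => this.addRelation(r))));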
forEach executes the callback against each element in the array and does not wait for anything. Using await is basically sugar for writing promise.then() and nesting everything that follows in the then() callback. But forEach doesn't return a promise, so await arr.forEach() is meaningless. The only reason it isn't a compile error is because the async/await spec says you can await anything, and if it isn't a promise you just get its value... forEach just gives you void.
If you want something to happen in sequence you can await in a for loop:
for (let i = 0; i < jsonFile.doc.annotations.length; i++) {
const annotation = jsonFile.doc.annotations[i];
for (let j = 0; j < annotation.entities.length; j++) {
const entity = annotation.entities[j];
await this.addVertex(entity);
});
// code here executes after all vertix have been added in order
Edit: While typing this a couple other answers and comments happened... you don't want to use a for loop, you can use Promise.all but there's still maybe some confusion, so I'll leave the above explanation in case it helps.
async/await does not work within forEach.
A simple solution: Replace .forEach() with for(.. of ..) instead.
Details in this similar question.
If the no-iterator linting rule is enabled, you will get a linting warning/error for using for(.. of ..). There is a lot of discussion and many opinions on this topic.
IMHO, this is a scenario where we can suppress the warning with eslint-disable-next-line or for the method/class.
Example:
const insertKpbDocument = async (jsonFile) => {
  for (let annotation of jsonFile.doc.annotations) {
    // eslint-disable-next-line no-iterator
    for (let entity of annotation.entities) {
      await this.addVertex(entity)
    }
    // eslint-disable-next-line no-iterator
    for (let relation of annotation.relations) {
      await this.addRelation(relation)
    }
  }
  return jsonFile
}
The code is very readable and works as expected. To get similar functionality with .forEach(), we need some promise/observable acrobatics that I think are a waste of effort.

How to 'order' your callbacks without using a blocking structure

I'm new to node.js and Stack Overflow and I'm having a little trouble: I want to read two files and do something with them in a specific order. The problem is that I don't know which one will finish being read first, so I don't know how to make sure they will trigger in the right order.
To give an example, let's say that I want to read two files and write them to the response:
fs.readFile('./file1',(err,data) => {
res.write(data);
})
fs.readFile('./file2',(err,data) => {
res.write(data);
})
How could I make sure the first file will be written before the second even if the second file is smaller than the first one?
I could do that:
fs.readFile('./file1',(err,data) => {
res.write(data);
fs.readFile('./file2',(err,data) => {
res.write(data);
})
})
But it would act like a blocking structure: the second one couldn't start being read before the end of the first one and that's not the point of Node.js... Am I right?
P.S. Sorry if the question is dumb or for my poor English
One of the reasons (out of several) that Promise and async/await exist is to solve exactly this kind of issue.
You can create a wrapper that returns a Promise:
function readSomething(filePath) {
  return new Promise((resolve, reject) => {
    fs.readFile(filePath, (err, data) => {
      if (err) reject(err);
      else resolve(data);
    });
  });
}
Then you can call it like this:
// called in parallel
const result1 = readSomething('./file1');
const result2 = readSomething('./file2');

result1.then((data) => {
  res.write(data); // or anything
  // to order, chain the promises
  return result2;
})
.then(data => res.write(data));
With async/await you can do this (await is valid only inside an async function):

async function handleResults() {
  res.write(await result1);
  res.write(await result2);
}
Note: watch out for errors (reject cases). They become a little tricky in this situation, but I'm sure you can figure it out.
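Worth noting: on current Node.js versions (10+) you don't strictly need the hand-rolled wrapper, because a promise-based fs API is built in. A minimal sketch:

const fs = require('fs');

// fs.promises.readFile returns a promise directly, no manual wrapping needed
const result1 = fs.promises.readFile('./file1');
const result2 = fs.promises.readFile('./file2');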
In Node.js you can also utilise the power of EventEmitter. You can also check out reactive programming, as implemented by the rxjs library, for the Observer pattern.
// inside an async function, using the promise-based fs API
let file1 = fs.promises.readFile('./file1');
let file2 = fs.promises.readFile('./file2');

file1 = await file1;
file2 = await file2;

res.write(file1);
res.write(file2);

This way the reading takes place in parallel, but if you know beforehand that file1 is going to be comparatively big, you can swap the await statements. You can then write in whichever order you want.

Promise map (node.js) with large list seems to stop iterating after certain index

I'm using the bluebird Promise library. From this library I'm using Promise.mapSeries() to perform a mapping operation on a very large list (size = 747357).
The code looks as follows (pseudo):
function myFunc(data) {
  return Promise.mapSeries(data, handler)
    .then((data) => {
      console.log('Success!');
      return Promise.resolve(data);
    })
    .catch(console.log);
}
In the handler, I'm doing a couple of things:
Running other Promise functions (recursive)
Returning a new data structure to be stored in the resulting array
In the handler function, I add a console.log('Iter: ', i);.
So then this returns the iter # for each item that's been mapped. It slows and eventually stops at #288. Does this reflect some sort of limit I'm hitting?
I don't understand what the problem is - logic says perhaps this list is too large to handle with Promise.mapSeries.
Any advice or solutions would be appreciated.
Thanks in advance.
UPDATE:
Here's a snippet of the handler function:
function handler(v, i, len) {
  return new Promise((resolve, reject) => {
    promise
      .then((data) => {
        return recursivePromiseFn(data);
      })
      .then((data) => {
        let _data = transformData(data);
        statusLogger(i);
        resolve(_data);
      });
  });
}
The problem was not with Promise.map or Promise.mapSeries. The problem was an infinite loop in one of my promise functions, which prevented the promise chain from continuing.
A noteworthy comment was that you should avoid the explicit Promise-constructor anti-pattern (wrapping an existing promise chain in new Promise). This is true, but it was not the cause of the problem.
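For reference, a sketch of the handler without that anti-pattern (keeping the question's promise, recursivePromiseFn, transformData and statusLogger exactly as given) simply returns the existing chain instead of wrapping it in new Promise:

function handler(v, i, len) {
  // return the chain itself; no new Promise wrapper or manual resolve needed
  return promise
    .then((data) => recursivePromiseFn(data))
    .then((data) => {
      const _data = transformData(data);
      statusLogger(i);
      return _data;
    });
}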

for loop on promises doesn't follow the expected order of output

I am trying to do a for loop over promises, but unfortunately the output is not what I am expecting.
My code:
var ahaha = function mytestfunction(numb) {
  return new Promise(function (resolve, reject) {
    console.log(numb);
    return resolve('test');
  })
  .then(function (x) {
    z = 'firststep' + x;
    console.log(z);
    return z;
  })
  .then(function (z) {
    console.log(z + 'secondstep');
    return z;
  })
  .then(function (z) {
    console.log(z + 'thirdstep');
  });
};

var promises = [];
for (var i = 1; i <= 2; i++) {
  promises.push(ahaha(i));
}
Promise.all(promises)
  .then(function (data) { console.log("everything is finished"); });
What it returns is:
1
2
firststeptest
firststeptest
firststeptestsecondstep
firststeptestsecondstep
firststeptestthirdstep
firststeptestthirdstep
everything is finished
But I want it to return:
1
firststeptest
firststeptestsecondstep
firststeptestthirdstep
2
firststeptest
firststeptestsecondstep
firststeptestthirdstep
everything is finished
I don't understand why the promises are not chained one after the other.
Note that I succeeded in doing this operation using async.waterfall, but I also want to know how to do it with promises.
Thanks a lot.
Promise.all() is purposely used when you want things to run in parallel and it will tell you when all are done and they may each finish in any order.
There are lots of different ways to sequence things using promises. If you just have two function calls like your code shows, you can just do them manually:
ahaha(1).then(result => ahaha(2)).then(data => {
console.log("everything finished");
});
Or, a common pattern using .reduce():
[1,2].reduce(p, val => {
return p.then(() => ahaha(val));
}, Promise.resolve()).then(data => {
// everything done here
});
Or, my favorite using the Bluebird promise library:
Promise.mapSeries([1, 2], ahaha).then(result => {
  // everything done here
});
There are many other schemes, which you can see in these other answers:
How to synchronize a sequence of promises?
JavaScript: Perform a chain of promises synchronously
ES6 Promises - something like async.each?
How can I execute shell commands in sequence?
Can Promise load multi urls in order?
Promise.all is used to run the promises in parallel, not sequentially.
Using the popular Bluebird library you could have used its reduce; there's no exact equivalent in standard ES6, but you can do this:
// reduce over the input values so each call starts only after the previous one finishes
[1, 2].reduce(
  (p, next) => p.then(() => ahaha(next)),
  Promise.resolve()
).then(data => { console.log("everything is finished"); });
