Using multiple awaits within a Promise that are dependent on each other - node.js

Before describing the issue in too much detail, I would like to know how to execute multiple dependent awaits inside a returned Promise one after another, without new data getting into my return Promise block in between. In other words, I just want my try block to be executed as one atomic statement.
const handler = (payload) => {
  return new Promise(async (resolve, reject) => {
    try {
      const exists = await getRedis(payload)
      if (exists === null) {
        await setRedis(payload)
        await write2Mongo(payload)
        resolve()
      } else {
        resolve()
      }
    } catch (err) {
      reject(err)
    }
  });
};
In concrete terms, it's about RabbitMQ ("amqplib": "^0.8.0"), from which the payloads come flying in. I first want to check whether they are already known to the system. If not, I want to set them in Redis ("async-redis": "^2.0.0") and then write them to MongoDB ("mongoose": "^6.0.9"). Since I get a lot of messages from RabbitMQ, it works fine at first and then I get a "duplicate key error" from Mongo. This happens because my first getRedis returns null, and while that payload is still being written to Redis and MongoDB, a second identical message enters my block and also gets null from getRedis, because the first one has not yet been set via setRedis.
From what I have read, this is an antipattern with bad error handling, but the corresponding posts have unfortunately not solved my problem.
Can you please help me?
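For reference, the antipattern mentioned above is the explicit Promise constructor wrapped around an async function; an async function already returns a Promise. A minimal sketch of the handler without it (this alone does not fix the race condition described above):
// Sketch without the explicit Promise constructor: an async function already
// returns a Promise, and a thrown error rejects it, so manual resolve/reject is not needed.
// Note: this does NOT make the Redis check and the writes atomic.
const handler = async (payload) => {
  const exists = await getRedis(payload);
  if (exists === null) {
    await setRedis(payload);
    await write2Mongo(payload);
  }
};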

In the scenario that you describe, you want a queue that you can process in series:
let payloads = [];
const handler = payload => payloads.push(payload);

;(async function insertDistinctPayloads() {
  // Drain the queue one payload at a time so that the Redis check,
  // setRedis and write2Mongo never run concurrently for two messages.
  while (payloads.length > 0) {
    const payload = payloads.shift();
    const exists = await getRedis(payload);
    if (exists === null) {
      await setRedis(payload);
      await write2Mongo(payload);
    }
  }
  setTimeout(insertDistinctPayloads, 100); // loop continuously with a small delay
})();
sorry for my bad english :-)
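For context, here is a rough sketch of how that handler might be wired to an amqplib consumer; the connection URL and queue name are placeholders, not taken from the question:
const amqp = require('amqplib');

(async () => {
  // Placeholder connection URL and queue name.
  const connection = await amqp.connect('amqp://localhost');
  const channel = await connection.createChannel();
  await channel.assertQueue('payload-queue');

  channel.consume('payload-queue', (msg) => {
    if (msg !== null) {
      // Push the payload into the in-memory queue; insertDistinctPayloads drains it in series.
      handler(JSON.parse(msg.content.toString()));
      channel.ack(msg);
    }
  });
})();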

Related

NodeJS: promiseAll getting stuck

One of the promises is not being resolved. What could the issue possibly be?
const items = await Promise.all(data.map(async i => {
  const tokenUri = await contract.tokenURI(i).catch(function (error) { return; });
  if (tokenUri.length < 8) {
    return;
  }
  let url = "http://127.0.0.1:5001/api/v0/get?arg=" + tokenUri.slice(7)
  const meta = await axios.post(url).catch(function (error) { return; });
  success++;
}), function (err) {
  callback(err);
})
This part of the code is misplaced:
, function (err) {
callback(err);
})
Currently, it is being passed as a second argument to Promise.all(). Promise.all() only takes one argument (an iterable like an array) so that's being ignored and that callback will NEVER get called.
If you meant for that to be the second argument to .then(), then you'd need a .then() to use it with; but you're passing it as a second argument to Promise.all(), and that is not correct.
If you want to know when Promise.all() is done, you would do this:
try {
  const items = await Promise.all(data.map(async i => {
    const tokenUri = await contract.tokenURI(i).catch(function (error) { return; });
    if (tokenUri.length < 8) {
      return;
    }
    let url = "http://127.0.0.1:5001/api/v0/get?arg=" + tokenUri.slice(7)
    const meta = await axios.post(url).catch(function (error) { return; });
    success++;
  }));
  console.log("Promise.all() is done", items);
} catch (e) {
  console.log(e);
}
Note that you are silently "eating" errors, with no logging of the error inside this loop, which is pretty much never a good idea.
If, as you propose, one of your promises is actually never being resolved/rejected, then none of this code will fix that; you will have to go down to the core operation and find out why it's not resolving/rejecting on its own. Or, configure a timeout for it (either in the API or by manually creating your own timeout), as sketched below.
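As an illustration of that last option, a promise can be raced against a timer. withTimeout and processItem are hypothetical names (processItem stands in for the async mapper above), not part of any library:
// Hypothetical helper: rejects if the wrapped promise takes longer than ms milliseconds.
function withTimeout(promise, ms) {
  const timeout = new Promise((_, reject) =>
    setTimeout(() => reject(new Error(`Timed out after ${ms} ms`)), ms)
  );
  return Promise.race([promise, timeout]);
}

// Usage sketch (inside an async function, as in the snippet above):
// fail the whole Promise.all() if any single item takes longer than 10 s.
const items = await Promise.all(
  data.map(i => withTimeout(processItem(i), 10000))
);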

How to make a function wait for data to appear in the DB? NodeJS

I am facing a peculiar situation.
I have a backend system (nodejs) which is being called by the FE (pretty standard :) ). This endpoint (nodejs) needs to call another, external system, get the data it produces, and return it to the FE. Until now it all might seem pretty usual, but here comes the catch.
The external system processes asynchronously: it responds to my request immediately but is still processing the data (it saves it in a DB), and I have to get that data from the DB and return it to the FE.
And here goes the question: what is the best (most efficient) way of doing this? It usually takes only a couple of seconds, and I am very hesitant to put a loop inside the function that waits for the data to appear in the DB.
Another way would be to have the external system call an endpoint at the end of the processing (if possible; I would need to check that with the partner) and have the original function wait until that endpoint is called. I am not sure exactly how to implement that, so if there is any documentation, article, or tutorial, I would very much appreciate it if you could share it.
thx for the ideas!
I can give you an example that checks the database and waits for a while if it can't find a record. I mocked the database connection so the example works on its own.
// Mocking starts
const ObjectID = () => {};
const db = {
  collection: {
    find: () => {
      return new Promise((resolve, reject) => {
        // Mock like no record found
        setTimeout(() => { console.log('No record found!'); resolve(false) }, 1500);
      });
    }
  }
}
// Mocking ends

const STANDBY_TIME = 1000; // 1 sec
const RETRY = 5; // Retry 5 times

const test = async () => {
  let haveFound = false;
  let i = 0;
  while (i < RETRY && !haveFound) {
    // Check the database
    haveFound = await checkDb();
    // If no record found, increment the loop count
    i++
  }
}

const checkDb = () => {
  return new Promise((resolve) => {
    setTimeout(async () => {
      const record = await db.collection.find({ _id: ObjectID("12345") });
      // Check whether you've found the record or not
      if (record) return resolve(true);
      resolve(false);
    }, STANDBY_TIME);
  });
}

test();
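The other approach mentioned in the question, having the external system call back when it has finished, could be sketched roughly like this. This assumes an Express app and an in-memory map of pending requests; all names here (callExternalSystem, /callback, the 30 s timeout) are placeholders, not part of the original setup:
const express = require('express');
const app = express();
app.use(express.json());

// Pending FE requests keyed by job id, waiting for the external system's callback.
const pending = new Map();

// FE-facing endpoint: start the external job, then wait for the callback (or time out).
app.post('/start', async (req, res) => {
  const jobId = await callExternalSystem(req.body); // placeholder for the call to the external system
  try {
    const result = await new Promise((resolve, reject) => {
      pending.set(jobId, resolve);
      setTimeout(() => {
        pending.delete(jobId);
        reject(new Error('Timed out waiting for the callback'));
      }, 30000);
    });
    res.json(result);
  } catch (err) {
    res.status(504).json({ error: err.message });
  }
});

// Endpoint the external system calls when processing is finished.
app.post('/callback/:jobId', (req, res) => {
  const resolve = pending.get(req.params.jobId);
  if (resolve) {
    pending.delete(req.params.jobId);
    resolve(req.body);
  }
  res.sendStatus(200);
});

app.listen(3000);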

Synchronously iterate through firestore collection

I have a firebase callable function that does some batch processing on documents in a collection.
The steps are
1. Copy the document to a separate collection to archive it
2. Run an HTTP request to a third-party service based on data in the document
3. If step 2 was successful, delete the document
I'm having trouble with forcing the code to run synchronously. I can't figure out the correct await syntax.
async function archiveOrders (myCollection: string) {
  // get documents in array for iterating
  const currentOrders = [];
  console.log('getting current orders');
  await db.collection(myCollection).get().then(querySnapshot => {
    querySnapshot.forEach(doc => {
      currentOrders.push(doc.data());
    });
  });
  console.log(currentOrders);
  // copy Orders
  currentOrders.forEach(async (doc) => {
    if (something about doc data is true) {
      let id = "";
      id = doc.id.toString();
      await db.collection(myCollection).doc(id).set(doc);
      console.log('this was copied: ' + id, doc);
    }
  });
}
To solve the problem I made a separate function call which returns a promise that I can await.
I also leveraged the QuerySnapshot, whose docs property returns an array of all the documents in the snapshot.
// from inside cloud function
// using firebase node.js admin sdk
const current_orders = await db.collection("currentOrders").get();

for (let index = 0; index < current_orders.docs.length; index++) {
  const order = current_orders.docs[index];
  await archive(order);
}

async function archive(doc) {
  const id = doc.id;          // the document id comes from the snapshot, not from data()
  const docData = doc.data(); // data() is synchronous, no await needed
  if (conditional logic....) {
    try {
      // await make third party api request
      await db.collection("currentOrders").doc(id).delete();
    } catch (err) {
      console.log(err)
    }
  } // end if
} // end archive
Now, I'm not familiar with Firebase, so you will have to tell me if there is something wrong with how I access the data.
You can use await Promise.all() to wait for all promises to resolve before you continue the execution of the function. Promise.all() will fire all requests simultaneously and will not wait for one to finish before firing the next one.
Also, although the syntax of async/await looks synchronous, things still happen asynchronously.
async function archiveOrders(myCollection: string) {
  console.log('getting current orders')
  const querySnapshot = await db.collection(myCollection).get()
  const currentOrders = querySnapshot.docs.map(doc => doc.data())
  console.log(currentOrders)
  await Promise.all(currentOrders.map((doc) => {
    if (something something) {
      return db.collection(myCollection).doc(doc.id.toString()).set(doc)
    }
  }))
  console.log('copied orders')
}
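If the writes really must happen one at a time, as the question title suggests, a plain for...of loop with await keeps them strictly sequential at the cost of total time. This is only a sketch; shouldArchive() is a placeholder for the "something something" condition above:
async function archiveOrdersSequentially(myCollection) {
  const querySnapshot = await db.collection(myCollection).get();
  for (const doc of querySnapshot.docs) {
    const data = doc.data();
    if (shouldArchive(data)) { // placeholder condition
      // Each write completes before the next one starts.
      await db.collection(myCollection).doc(doc.id).set(data);
    }
  }
}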

Accessing firestore data at a global scope

I'm trying to access the firestore data and push it to an array. This is super basic but for some reason I cannot figure out why this isn't working:
var db = admin.firestore();
let arr = [];
var Ref = db.collection('Test').doc('Document');
var getDoc = Ref.get()
  .then(doc => {
    if (!doc.exists) {
      console.log('No such document!');
    } else {
      let data = doc.data().Name;
      arr.push(data);
    }
  })
  .catch(err => {
    console.log('Error getting document', err);
  });
console.log(arr) // expecting >>> ['Joe'] (ie: the data that is in the firestore object)
Why doesn't arr contain the firestore object?
Thanks for the help.
It doesn't contain the data from Firestore yet.
The get() call operates asynchronously: it returns a Promise and program execution continues. The next line is the console.log(arr), but arr isn't populated yet; it is only populated when the Promise completes (i.e. when the then() part runs).
If you're using a sufficiently modern version of node (node 8 and up, which you should be using at this point), then you can use await to wait for the asynchronous operation to complete before continuing to the next line.
I haven't tested it, but your code might look something like this after a rewrite:
const doc = await Ref.get();
if (!doc.exists) {
  console.log('No such document!');
} else {
  let data = doc.data().Name;
  arr.push(data);
}
console.log(arr)
This will work because the await waits for the async get() to complete and return the doc. The rest of it is processed synchronously.
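One caveat: await can only appear inside an async function (or at the top level of an ES module), so in a plain script the rewrite needs a small wrapper. loadNames is a hypothetical name used here for illustration:
async function loadNames() {
  const arr = [];
  const doc = await Ref.get();
  if (!doc.exists) {
    console.log('No such document!');
  } else {
    arr.push(doc.data().Name);
  }
  console.log(arr); // ['Joe'] once the awaited get() has completed
  return arr;
}

loadNames().catch(err => console.log('Error getting document', err));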

Handling exceptions within recursive promise

I'm trying both to handle a paginated API and to retry when throttled for too many requests. The pagination is handled by recursing if 'nextToken' is present in the response object. I'm hoping to be able to catch a throttling exception and effectively start the whole request over by recursing without passing the token. This is my current code:
function getAllExecHist(execArn) {
  var sfn = new AWS.StepFunctions();
  sfn = Promise.promisifyAll(sfn);
  execHists = [];
  return new Promise(function(resolve, reject) {
    function getExecHist(nextToken) {
      params = {};
      params.executionArn = execArn;
      if (nextToken !== undefined) {
        params.nextToken = nextToken;
      }
      sfn.getExecutionHistoryAsync(params)
        .then(function(results) {
          execHists = execHists.concat(results.events);
          if (!results.nextToken) {
            resolve(execHists);
          }
          else {
            getExecHist(results.nextToken);
          }
        })
        .catch(function(e) {
          console.log('caught this: ', e);
          console.log('retrying');
          return new Promise(function(res, rej) {
            console.log('Sleeping');
            setTimeout(function() {
              execHists = [];
              res(getExecHist());
            }, random(100, 10000));
          });
        })
    }
    getExecHist();
  });
}
The recursion was handling pagination without issue, but since adding the catch, it simply never returns. Any ideas what I'm doing wrong / how to fix?
The AWS SDK supports promises and you can configure Bluebird as its promise library.
const Promise = require('bluebird');
const AWS = require('aws-sdk');

AWS.config.setPromisesDependency(Promise);
const sfn = new AWS.StepFunctions();
Use Promise.delay() instead of setTimeout.
Try and avoid creating new promises if functions are already returning them. Only wrap a promise in new Promise if you have a lot of synchronous code that might throw an error or needs to resolve the promise early.
The following also avoids the extra function and nested scope by passing values between function calls.
function getExecHist(execArn, execHists, nextToken) {
  let params = {};
  params.executionArn = execArn;
  if (nextToken !== undefined) params.nextToken = nextToken;
  if (execHists === undefined) execHists = [];

  return sfn.getExecutionHistory(params).promise()
    .then(results => {
      execHists = execHists.concat(results.events);
      if (!results.nextToken) return execHists;
      return getExecHist(execArn, execHists, results.nextToken);
    })
    .catch(e => {
      console.log('caught this: ', e);
      console.log('retrying');
      return Promise.delay(random(100, 10000))
        .then(() => getExecHist(execArn));
    })
}
Eventually you should be specific about which errors you retry on, and include a retry count or time limit too.
Also note that this is the wrong way to retry a rate limit issue, as it starts again from the beginning. A rate limit retry should continue from where it left off, otherwise you are just adding to your rate limit problems.
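Building on the function above, a bounded retry that resumes from the current page rather than restarting might look roughly like this. The maxRetries value and the ThrottlingException check are assumptions, not from the original question:
function getExecHist(execArn, execHists = [], nextToken, retries = 0) {
  const maxRetries = 5; // assumption: give up after 5 throttled attempts per page
  const params = { executionArn: execArn };
  if (nextToken !== undefined) params.nextToken = nextToken;

  return sfn.getExecutionHistory(params).promise()
    .then(results => {
      execHists = execHists.concat(results.events);
      if (!results.nextToken) return execHists;
      // Move to the next page with a fresh retry budget.
      return getExecHist(execArn, execHists, results.nextToken, 0);
    })
    .catch(e => {
      // Only retry throttling errors, and only a limited number of times.
      if (e.code !== 'ThrottlingException' || retries >= maxRetries) throw e;
      console.log('throttled, retrying the same page, attempt', retries + 1);
      return Promise.delay(random(100, 10000))
        // Keep the events collected so far and retry with the same token.
        .then(() => getExecHist(execArn, execHists, nextToken, retries + 1));
    })
}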
