postsArr does not get data
router.get('/user-post/:id', checkJwt, (req, res, next) => {
    let postsArr = []
    db.userSchema.findOne({ _id: req.params.id })
        .populate('posts')
        .exec((err, da) => {
            for (let i = 0; i < da.posts.length; i++) {
                db.postSchema.find({ _id: da.posts[i]._id })
                    .populate('comments')
                    .exec((err, post) => {
                        postsArr.push(post)
                    })
            }
            console.log(postsArr)
        })
})
This is a whole lot easier if you use the promise interface on your database:
router.get('/user-post/:id', checkJwt, async (req, res, next) => {
    try {
        let da = await db.userSchema.findOne({ _id: req.params.id }).populate('posts').exec();
        let postsArray = await Promise.all(da.posts.map(post => {
            return db.postSchema.find({ _id: post._id }).populate('comments').exec();
        }));
        res.json(postsArray);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
The challenge with an asynchronous operation in a loop is that they don't run sequentially - they all run in parallel. The for loop just starts all your asynchronous operations and then you never know when they are all done unless you track them all somehow. That can be done without promises by using counters to keep track of when every single asynchronous result is done, but it's a whole lot easier to just let Promise.all() do that for you. It will also put all the results in the right order for you too.
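For comparison, here is a minimal sketch of that manual-counter approach, reusing the db.postSchema calls from the question; Promise.all() spares you all of this bookkeeping:
let postsArr = new Array(da.posts.length);
let remaining = da.posts.length;

da.posts.forEach((post, i) => {
    db.postSchema.find({ _id: post._id })
        .populate('comments')
        .exec((err, result) => {
            if (err) return next(err);  // bail out on the first error (a real version would guard against responding twice)
            postsArr[i] = result;       // store by index to preserve order
            if (--remaining === 0) {
                res.json(postsArr);     // every callback has now completed
            }
        });
});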
If you wanted to sequence the database operations and run them serially one at a time, you could do this:
router.get('/user-post/:id', checkJwt, async (req, res, next) => {
    try {
        let da = await db.userSchema.findOne({ _id: req.params.id }).populate('posts').exec();
        let postsArray = [];
        for (let post of da.posts) {
            let result = await db.postSchema.find({ _id: post._id }).populate('comments').exec();
            postsArray.push(result);
        }
        res.json(postsArray);
    } catch (e) {
        console.log(e);
        res.sendStatus(500);
    }
});
This second version runs only one database operation at a time, sequentially. It will put less peak load on the database, but likely be slower to finish.
You will notice that the use of promises and await makes the error handling much simpler too, as all errors will propagate to the same try/catch where you can log the error and send an error response. Your original code did not have any error handling on your DB calls.
I have a fairly straightforward CRUD app which renders the results of two queries onto one page. The problem that arose once I got this to "work" was that the page required a refresh in order to display the results. On first load, no results were displayed.
I came to figure out that this is a problem/symptom of Node's asynchronous nature. I've been trying to approach this problem by using async/await, and from hours of messing with things, I feel like I'm quite close to the solution, but it's just not working out - I still need a manual refresh to display/render the results on the .ejs page.
The code:
var entries = [];
var frontPageGoals = [];

app.get('/entries', async (req, res) => {
    if (req.session.password) {
        const entriesColl = await
            db.collection('entries')
                .find()
                .sort({date: -1})
                .toArray((err, result) => {
                    if (err) { console.log(err) }
                    else {
                        for (i = 0; i < result.length; i++) {
                            entries[i] = result[i];
                        }
                    }
                });
        const goalsColl = await
            db.collection('goals')
                .find()
                .toArray((err, result) => {
                    if (err) { console.log(err) }
                    else {
                        for (i = 0; i < result.length; i++) {
                            frontPageGoals[i] = result[i];
                        }
                    }
                });
        res.render('index.ejs', {entries: entries, frontPageGoals: frontPageGoals});
    }
    else {
        res.redirect('/');
    }
});
Now, I can conceive of a few problems here, but honestly I'm just at my wits end trying to figure this out. For example, I'm sure it's problematic that the empty lists which will contain the results to be passed when the page renders are outside the actual async function. But after trying to move them a dozen different places within the async area... still no dice.
Any help would be hugely appreciated! This is basically the last big "thing" I need done for this app.
I'm not 100% sure about your database driver, but assuming that toArray() returns a promise (which it does in the default mongodb driver), the await will return the value you expected in your callback (result in your case). If there was an error (which you expected as err in your callback), it will be thrown instead, forcing you to use a try-catch block; in your case you would just console.log(err) in the catch block, since you aren't doing any other handling.
Here's your code after updating:
app.get("/entries", async (req, res) => {
if (req.session.password) {
try {
const entries = await db
.collection("entries")
.find()
.sort({ date: -1 })
.toArray();
const frontPageGoals = await db
.collection("goals")
.find()
.toArray();
res.render("index.ejs", {
entries: entries,
frontPageGoals: frontPageGoals
});
} catch (err) {
console.log(err);
}
} else {
res.redirect("/");
}
});
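One small variation worth noting: since the two queries don't depend on each other, they could also be started together and awaited with Promise.all(), for example:
// Run both independent queries in parallel instead of one after the other.
const [entries, frontPageGoals] = await Promise.all([
    db.collection("entries").find().sort({ date: -1 }).toArray(),
    db.collection("goals").find().toArray()
]);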
EDIT
However, if you aren't familiar with promises (async/await is essentially promise-based) and just want to do it using callbacks (not advised), you would have to send your response in the callback and nest the 2nd query inside the first query's callback. Here is the code, with some comments to hopefully help you out:
app.get("/entries", (req, res) => {
if (req.session.password) {
// First query
db.collection("entries")
.find()
.sort({ date: -1 })
.toArray((err, entryResult) => {
if (err) {
console.log(err);
} else {
// In the callback of the first query, so it will
// execute 2nd query, only when the first one is done
db.collection("goals")
.find()
.toArray((err, frontPageResult) => {
if (err) {
console.log(err);
} else {
// In the callback of the 2nd query, send the response
// here since both data are at hand
res.render("index.ejs", {
entries: entryResult,
frontPageGoals: frontPageResult
});
}
});
}
});
} else {
res.redirect("/");
}
});
I have removed the async keyword since you no longer need it.
I renamed the callback arguments, instead of using just result for both, because the two callbacks would otherwise have had the same argument name and you would have had to store the first result in a temporary variable.
Consider a promise-chained chunk of code for example:
return Promise.resolve()
    .then(function () {
        return createSomeData(...);
    })
    .then(function () {
        return updateSomeData(...);
    })
    .then(function () {
        return deleteSomeData(...);
    })
    .catch(function (error) {
        return ohFishPerformRollbacks();
    })
    .then(function () {
        return Promise.reject('something failed somewhere');
    })
In the above code, let's say something went wrong in the function updateSomeData(...). Then one would have to revert the create operation that was executed before this.
In another case, if something went wrong in the function deleteSomeData(...), then one would want to revert the operations executed in createSomeData(...) and updateSomeData(...).
This would continue as long as all the blocks have some revert operations defined for themselves in case anything goes wrong.
If only there were a way, in either NodeJS or the database or somewhere in the middle, to revert all the transactions happening under the same block of code.
One way I can think of to make this happen is by flagging all the rows in the database with a transactionId (ObjectID) and a wasTransactionSuccessful (boolean), so that CRUD operations can be grouped together by their transactionIds, and in case something goes wrong, those transactions can simply be deleted from the database in the ending catch block.
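Roughly, the idea would look something like this (createSomeData/updateSomeData/deleteSomeData are the hypothetical operations from the chain above, assumed to tag every document they write with the given transactionId, and the collection names here are just placeholders):
// Sketch of the transactionId idea (hypothetical helpers, mongodb driver assumed).
const { ObjectId } = require('mongodb');

const txnId = new ObjectId();

return createSomeData({ transactionId: txnId })
    .then(() => updateSomeData({ transactionId: txnId }))
    .then(() => deleteSomeData({ transactionId: txnId }))
    .catch((error) => {
        // Something failed: remove everything written under this transaction id.
        return Promise.all([
            db.collection('someData').deleteMany({ transactionId: txnId }),
            db.collection('otherData').deleteMany({ transactionId: txnId })
        ]).then(() => Promise.reject(error)); // keep rejecting so callers still see the failure
    });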
I read about rolling back transactions in https://docs.mongodb.com/manual/tutorial/perform-two-phase-commits/. But I want to see if it can be done in a simpler fashion and in a generic manner that NoSQL databases can adopt.
I am not sure if this would satisfy your use case, but I hope it would.
let indexArray = [1, 2, 3];
let promiseArray = [];

let sampleFunction = (index) => {
    return new Promise((resolve, reject) => {
        setTimeout(resolve, 100, index);
    });
}

indexArray.map((element) => {
    promiseArray.push(sampleFunction(element));
});

Promise.all(promiseArray).then((data) => {
    // do whatever you want with the results
}).catch((err) => {
    // Perform your entire rollback here
});
async.waterfall([
    firstFunc,
    secondFunc
], function (err, result) {
    if (err) {
        // delete the entire thing
    }
});
Using the async library would give you a much more elegant solution than going with chaining.
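For instance, a rough sketch of what that could look like, where firstFunc/secondFunc are hypothetical wrappers around the createSomeData/updateSomeData operations from the question and the created document is assumed to expose an _id:
// async.waterfall sketch: the final callback is the single rollback point.
const async = require('async');

let createdId = null; // remember what was created so it can be undone

function firstFunc(callback) {
    createSomeData()                        // create step from the question's chain
        .then(created => {
            createdId = created._id;        // assumption: created document has an _id
            callback(null, created);
        })
        .catch(err => callback(err));
}

function secondFunc(created, callback) {
    updateSomeData(created)                 // update step from the question's chain
        .then(result => callback(null, result))
        .catch(err => callback(err));
}

async.waterfall([
    firstFunc,
    secondFunc
], function (err, result) {
    if (err) {
        // Any step failing lands here exactly once; roll back using createdId.
        return ohFishPerformRollbacks();
    }
});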
I'm trying to test this function, which combs through a database for all unfilled currency trades and checks for a price match. If it finds one, database calls take place to close the trade and increment the trader's balance. I pass it an array of prices from another function, which is just a series of API calls. This is the function in question:
function executeTrade(pricesArr) {
    // currencies array must match up one-to-one with pricesArr array
    const currencies = ['btc', 'ltc', 'eth', 'doge'];
    let chosen;
    // Pull trades from database
    return Trade.find().then(dbTrades => {
        console.log('foo')
        // Get only open trades
        const openTrades = dbTrades.filter(trade => trade.open);
        openTrades.forEach(trade => {
            const balance = `${trade.curr_bought}_balance`;
            // Get price to compare
            if (trade.curr_bought === 'usd') chosen = pricesArr[0];
            else {
                for (let i = 0; i < currencies.length; i++) {
                    if (trade.curr_bought === currencies[i]) {
                        chosen = pricesArr[i];
                    }
                }
            }
            // Do math depending on buying BTC with USD or something else
            if ((trade.curr_bought === 'usd' && trade.sold_amount >= (trade.bought_amount / chosen)) || (trade.sold_amount >= chosen * trade.bought_amount)) {
                // Close trade order
                return trade.update({$set: { "open": false }})
                    .then(() => {
                        // Update user's balance to reflect successful trade
                        return User.findOne({"_id": trade.owner}).then(user => {
                            user.update({
                                $set: {
                                    [balance]: user[balance] + parseFloat(trade.bought_amount)
                                }
                            }).then(doc => {
                                res.json(doc);
                            }).catch(err => console.log(err));
                        }).catch(err => console.log(err));
                    });
            }
        });
    });
};
I'm trying to test it with this test code:
it('Executes a trade if the specified sell prices are greater than or equal to the reported VWAP', done => {
    const pricesArr = [0.1, 1, 1, 1];
    executeTrade(pricesArr);
    app
        .get(`/api/trades/?_id=${testTrade._id}`)
        .set('Accept', 'application/json')
        .expect('Content-Type', /json/)
        .expect(200)
        .end((err, res) => {
            console.log(res.body);
            expect(res.body[0].open).to.be.false;
            done();
        });
});
The problem is that none of the database calls are executed in the test. The function, and all other tests, work fine in the context of a call to the Express server I'm using to make these calls on the actual web app.
I have even tried executing a simple find operation, both inside an it() function and outside one, but neither is ever executed.
What am I missing, here?
Because your executeTrade function executes asynchronously, there is no guarantee that the asynchronous calls it contains will complete prior to the next function call (i.e. making the API call in your test). Try something like this instead:
executeTrade(pricesArr)
    .then(() => {
        // make api call and check expected
    });
This ensures that the promise returned by executeTrade settles prior to running the contents of the "then" block. In addition, you are returning promises inside of a forEach construct, which again means there is no guarantee that the promised results will settle before executeTrade returns. To fix this, I suggest instead using a pattern like:
return Promise.all(openTrades.map((trade) => {
    // do stuff here
}));
That will only settle once either all promises returned in the map function fulfill or once one of them rejects.
Finally, it looks as if your executeTrade function could potentially call res.json() multiple times. You can't respond to the same request more than once, and I also don't see where res is defined in your function, so maybe avoid that too.
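Putting those pieces together, a sketch of how executeTrade might be restructured (keeping your Trade/User models and comparison logic, but returning a single promise and leaving the HTTP response to the route handler):
function executeTrade(pricesArr) {
    const currencies = ['btc', 'ltc', 'eth', 'doge'];

    return Trade.find().then(dbTrades => {
        const openTrades = dbTrades.filter(trade => trade.open);

        // Map every open trade to a promise and settle when all of them are done.
        return Promise.all(openTrades.map(trade => {
            const balance = `${trade.curr_bought}_balance`;
            const chosen = trade.curr_bought === 'usd'
                ? pricesArr[0]
                : pricesArr[currencies.indexOf(trade.curr_bought)];
            const matched = trade.curr_bought === 'usd'
                ? trade.sold_amount >= (trade.bought_amount / chosen)
                : trade.sold_amount >= chosen * trade.bought_amount;

            if (!matched) return null; // nothing to do for this trade

            return trade.update({ $set: { open: false } })
                .then(() => User.findOne({ _id: trade.owner }))
                .then(user => user.update({
                    $set: { [balance]: user[balance] + parseFloat(trade.bought_amount) }
                }));
        }));
    });
}

// In the test, wait for it before hitting the API:
// executeTrade(pricesArr).then(() => { /* app.get(...).expect(...) */ });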
Google has a nice guide on promises that I suggest you take a look at.
I'm trying to update a tool that was created a while ago which uses nodejs (I am not a JS developer, so I'm trying to piece the code together) and am getting stuck at the last hurdle.
The new functionality takes in a swagger .json definition, compares the endpoints against the matching API Gateway on the AWS service using the 'aws-sdk' SDK for JS, and then updates the Gateway accordingly.
The code runs fine on a small definition file (about 15 endpoints) but as soon as I give it a bigger one, I start getting tons of TooManyRequestsException errors.
I understand that this is due to my calls to the API Gateway service being made too quickly, and that a delay / pause is needed. This is where I am stuck.
I have tried adding:
a delay() to each promise being returned
a setTimeout() inside each promise
a delay to the Promise.all and Promise.mapSeries
Currently my code loops through each endpoint within the definition and then adds the response of each promise to a promise array:
promises.push(getMethodResponse(resourceMethod, value, apiName, resourcePath));
Once the loop is finished I run this:
return Promise.all(promises)
    .catch((err) => {
        winston.error(err);
    })
I have tried the same with a mapSeries (no luck).
It looks like the functions within the getMethodResponse promise are run immediately, and hence, no matter what type of delay I add, they all still just execute. My suspicion is that I need to make getMethodResponse return a function and then use mapSeries, but I can't get this to work either.
Code I tried:
Wrapped the getMethodResponse in this:
return function(value){}
Then added this after the loop (and within the loop - no difference):
Promise.mapSeries(function (promises) {
    return 'a'();
}).then(function (results) {
    console.log('result', results);
});
I also tried many other suggestions from similar questions and answers, with no luck.
Any suggestions please?
EDIT
As requested, here is some additional code to try to pinpoint the issue.
The code currently working with a small set of endpoints (within the Swagger file):
module.exports = (apiName, externalUrl) => {
    return getSwaggerFromHttp(externalUrl)
        .then((swagger) => {
            let paths = swagger.paths;
            let resourcePath = '';
            let resourceMethod = '';
            let promises = [];
            _.each(paths, function (value, key) {
                resourcePath = key;
                _.each(value, function (value, key) {
                    resourceMethod = key;
                    let statusList = [];
                    _.each(value.responses, function (value, key) {
                        if (key >= 200 && key <= 204) {
                            statusList.push(key)
                        }
                    });
                    _.each(statusList, function (value, key) { //Only for 200-201 range
                        //Working with small set
                        promises.push(getMethodResponse(resourceMethod, value, apiName, resourcePath))
                    });
                });
            });
            //Working with small set
            return Promise.all(promises)
                .catch((err) => {
                    winston.error(err);
                })
        })
        .catch((err) => {
            winston.error(err);
        });
};
I have since tried adding this in place of the return Promise.all():
Promise.map(promises, function() {
    // Promise.map awaits for returned promises as well.
    console.log('X');
}, {concurrency: 5})
    .then(function() {
        return console.log("y");
    });
Running this spits out something like the following (it's the same for each endpoint, and there are many):
Error: TooManyRequestsException: Too Many Requests
X
Error: TooManyRequestsException: Too Many Requests
X
Error: TooManyRequestsException: Too Many Requests
The AWS SDK is being called 3 times within each promise; the functions involved (initiated from the getMethodResponse() function) are:
apigateway.getRestApisAsync()
return apigateway.getResourcesAsync(resourceParams)
apigateway.getMethodAsync(params, function (err, data) {}
The AWS SDK documentation states that this is typical behaviour when too many consecutive calls are made (too fast). I've had a similar issue in the past which was resolved by simply adding a .delay(500) into the code being called.
Something like:
return apigateway.updateModelAsync(updateModelParams)
    .tap(() => logger.verbose(`Updated model ${updatedModel.name}`))
    .tap(() => bar.tick())
    .delay(500)
EDIT #2
In the name of thoroughness, here is my entire .js file.
'use strict';

const AWS = require('aws-sdk');
let apigateway, lambda;
const Promise = require('bluebird');
const R = require('ramda');
const logger = require('../logger');
const config = require('../config/default');
const helpers = require('../library/helpers');
const winston = require('winston');
const request = require('request');
const _ = require('lodash');
const region = 'ap-southeast-2';
const methodLib = require('../aws/methods');
const emitter = require('../library/emitter');

emitter.on('updateRegion', (region) => {
    region = region;
    AWS.config.update({ region: region });
    apigateway = new AWS.APIGateway({ apiVersion: '2015-07-09' });
    Promise.promisifyAll(apigateway);
});
function getSwaggerFromHttp(externalUrl) {
    return new Promise((resolve, reject) => {
        request.get({
            url: externalUrl,
            header: {
                "content-type": "application/json"
            }
        }, (err, res, body) => {
            if (err) {
                winston.error(err);
                reject(err);
            }
            let result = JSON.parse(body);
            resolve(result);
        })
    });
}
/*
    Deletes a method response
*/
function deleteMethodResponse(httpMethod, resourceId, restApiId, statusCode, resourcePath) {
    let methodResponseParams = {
        httpMethod: httpMethod,
        resourceId: resourceId,
        restApiId: restApiId,
        statusCode: statusCode
    };
    return apigateway.deleteMethodResponseAsync(methodResponseParams)
        .delay(1200)
        .tap(() => logger.verbose(`Method response ${statusCode} deleted for path: ${resourcePath}`))
        .error((e) => {
            return console.log(`Error deleting Method Response ${httpMethod} not found on resource path: ${resourcePath} (resourceId: ${resourceId})`); // an error occurred
            logger.error('Error: ' + e.stack)
        });
}
/*
    Deletes an integration response
*/
function deleteIntegrationResponse(httpMethod, resourceId, restApiId, statusCode, resourcePath) {
    let methodResponseParams = {
        httpMethod: httpMethod,
        resourceId: resourceId,
        restApiId: restApiId,
        statusCode: statusCode
    };
    return apigateway.deleteIntegrationResponseAsync(methodResponseParams)
        .delay(1200)
        .tap(() => logger.verbose(`Integration response ${statusCode} deleted for path ${resourcePath}`))
        .error((e) => {
            return console.log(`Error deleting Integration Response ${httpMethod} not found on resource path: ${resourcePath} (resourceId: ${resourceId})`); // an error occurred
            logger.error('Error: ' + e.stack)
        });
}
/*
    Get Resource
*/
function getMethodResponse(httpMethod, statusCode, apiName, resourcePath) {
    let params = {
        httpMethod: httpMethod.toUpperCase(),
        resourceId: '',
        restApiId: ''
    }
    return getResourceDetails(apiName, resourcePath)
        .error((e) => {
            logger.unimportant('Error: ' + e.stack)
        })
        .then((result) => {
            // Only run the comparison of models if the resourceId (from the url passed in) is found within the AWS Gateway
            if (result) {
                params.resourceId = result.resourceId
                params.restApiId = result.apiId
                var awsMethodResponses = [];
                try {
                    apigateway.getMethodAsync(params, function (err, data) {
                        if (err) {
                            if (err.statusCode == 404) {
                                return console.log(`Method ${params.httpMethod} not found on resource path: ${resourcePath} (resourceId: ${params.resourceId})`); // an error occurred
                            }
                            console.log(err, err.stack); // an error occurred
                        }
                        else {
                            if (data) {
                                _.each(data.methodResponses, function (value, key) {
                                    if (key >= 200 && key <= 204) {
                                        awsMethodResponses.push(key)
                                    }
                                });
                                awsMethodResponses = _.pull(awsMethodResponses, statusCode); // List of items not found within the Gateway - to be removed.
                                _.each(awsMethodResponses, function (value, key) {
                                    if (data.methodResponses[value].responseModels) {
                                        var existingModel = data.methodResponses[value].responseModels['application/json']; // Check if there is currently a model attached to the resource / method about to be deleted
                                        methodLib.updateResponseAssociation(params.httpMethod, params.resourceId, params.restApiId, statusCode, existingModel); // Associate this model to the same resource / method, under the new response status
                                    }
                                    deleteMethodResponse(params.httpMethod, params.resourceId, params.restApiId, value, resourcePath)
                                        .delay(1200)
                                        .done();
                                    deleteIntegrationResponse(params.httpMethod, params.resourceId, params.restApiId, value, resourcePath)
                                        .delay(1200)
                                        .done();
                                })
                            }
                        }
                    })
                    .catch(err => {
                        console.log(`Error: ${err}`);
                    });
                }
                catch (e) {
                    console.log(`getMethodAsync failed, Error: ${e}`);
                }
            }
        })
};
function getResourceDetails(apiName, resourcePath) {
    let resourceExpr = new RegExp(resourcePath + '$', 'i');
    let result = {
        apiId: '',
        resourceId: '',
        path: ''
    }
    return helpers.apiByName(apiName, AWS.config.region)
        .delay(1200)
        .then(apiId => {
            result.apiId = apiId;
            let resourceParams = {
                restApiId: apiId,
                limit: config.awsGetResourceLimit,
            };
            return apigateway.getResourcesAsync(resourceParams)
        })
        .then(R.prop('items'))
        .filter(R.pipe(R.prop('path'), R.test(resourceExpr)))
        .tap(helpers.handleNotFound('resource'))
        .then(R.head)
        .then([R.prop('path'), R.prop('id')])
        .then(returnedObj => {
            if (returnedObj.id) {
                result.path = returnedObj.path;
                result.resourceId = returnedObj.id;
                logger.unimportant(`ApiId: ${result.apiId} | ResourceId: ${result.resourceId} | Path: ${result.path}`);
                return result;
            }
        })
        .catch(err => {
            console.log(`Error: ${err} on API: ${apiName} Resource: ${resourcePath}`);
        });
};
function delay(t) {
    return new Promise(function (resolve) {
        setTimeout(resolve, t)
    });
}
module.exports = (apiName, externalUrl) => {
    return getSwaggerFromHttp(externalUrl)
        .then((swagger) => {
            let paths = swagger.paths;
            let resourcePath = '';
            let resourceMethod = '';
            let promises = [];
            _.each(paths, function (value, key) {
                resourcePath = key;
                _.each(value, function (value, key) {
                    resourceMethod = key;
                    let statusList = [];
                    _.each(value.responses, function (value, key) {
                        if (key >= 200 && key <= 204) {
                            statusList.push(key)
                        }
                    });
                    _.each(statusList, function (value, key) { //Only for 200-201 range
                        promises.push(getMethodResponse(resourceMethod, value, apiName, resourcePath))
                    });
                });
            });
            //Working with small set
            return Promise.all(promises)
                .catch((err) => {
                    winston.error(err);
                })
        })
        .catch((err) => {
            winston.error(err);
        });
};
You apparently have a misunderstanding about what Promise.all() and Promise.map() do.
All Promise.all() does is keep track of a whole array of promises to tell you when the async operations they represent are all done (or one returns an error). When you pass it an array of promises (as you are doing), ALL those async operations have already been started in parallel. So, if you're trying to limit how many async operations are in flight at the same time, it's already too late at that point. So, Promise.all() by itself won't help you control how many are running at once in any way.
I've also noticed since then that this line, promises.push(getMethodResponse(resourceMethod, value, apiName, resourcePath)), seems to actually be executing promises rather than simply adding them to the array. It seems like the last Promise.all() doesn't actually do much.
Yep, when you execute promises.push(getMethodResponse()), you are calling getMethodResponse() immediately right then. That starts the async operation immediately. That function then returns a promise and Promise.all() will monitor that promise (along with all the other ones you put in the array) to tell you when they are all done. That's all Promise.all() does. It monitors operations you've already started. To keep the max number of requests in flight at the same time below some threshold, you have to NOT START the async operations all at once like you are doing. Promise.all() does not do that for you.
For Bluebird's Promise.map() to help you at all, you have to pass it an array of DATA, not promises. When you pass it an array of promises that represent async operations that you've already started, it can do no more than Promise.all() can do. But, if you pass it an array of data and a callback function that can then initiate an async operation for each element of data in the array, THEN it can help you when you use the concurrency option.
Your code is pretty complex so I will illustrate with a simple web scraper that wants to read a large list of URLs, but for memory considerations, only process 20 at a time.
const rp = require('request-promise');

let urls = [...]; // large array of URLs to process

Promise.map(urls, function(url) {
    return rp(url).then(function(data) {
        // process scraped data here
        return someValue;
    });
}, {concurrency: 20}).then(function(results) {
    // process array of results here
}).catch(function(err) {
    // error here
});
In this example, hopefully you can see that an array of data items is being passed into Promise.map() (not an array of promises). This then allows Promise.map() to manage how and when the array is processed; in this case, it will use the concurrency: 20 setting to make sure that no more than 20 requests are in flight at the same time.
Your effort to use Promise.map() was passing an array of promises, which does not help you since the promises represent async operations that have already been started:
Promise.map(promises, function() {
    ...
});
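Applied to your code, the data-first version might look something like the sketch below, collecting plain descriptor objects in the loop and letting Promise.map() start the getMethodResponse() calls itself (this assumes getMethodResponse() returns a promise that settles when its work is actually finished):
// Collect data, not promises, then let Promise.map() control concurrency.
let work = [];
_.each(paths, function (pathValue, resourcePath) {
    _.each(pathValue, function (methodValue, resourceMethod) {
        _.each(methodValue.responses, function (respValue, statusCode) {
            if (statusCode >= 200 && statusCode <= 204) {
                work.push({ resourceMethod, statusCode, resourcePath });
            }
        });
    });
});

return Promise.map(work, function (item) {
    return getMethodResponse(item.resourceMethod, item.statusCode, apiName, item.resourcePath);
}, { concurrency: 5 });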
Then, in addition, you really need to figure out what exactly causes the TooManyRequestsException error, either by reading the documentation of the target API or by doing a whole bunch of testing. There can be a variety of things that might cause it, and without knowing exactly what you need to control, it takes a lot of wild guessing to figure out what might work. The most common things an API might detect are:
Simultaneous requests from the same account or source.
Requests per unit of time from the same account or source (such as request per second).
The concurrency option in Promise.map() will easily help you with the first case, but will not necessarily help you with the second, since you can limit yourself to a low number of simultaneous requests and still exceed a requests-per-second limit. The second needs some actual time control. Inserting delay() statements will sometimes work, but even that is not a very direct way of managing it and will lead either to inconsistent control (something that works sometimes, but not other times) or to sub-optimal control (limiting yourself to something far below what you could actually use).
To manage to a request per second limit, you need some actual time control with a rate limiting library or actual rate limiting logic in your own code.
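If you end up hand-rolling it, a minimal sketch of that kind of time control is to run the operations one at a time with a fixed gap between them (the gap value is a guess you would tune against the API's actual limit); the answer linked below describes a more complete scheme:
// Run one async operation per item, sequentially, with a fixed delay between them.
function delay(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

function runRateLimited(items, fn, gapMs) {
    const results = [];
    return items.reduce((chain, item) => {
        return chain
            .then(() => fn(item))
            .then(result => { results.push(result); })
            .then(() => delay(gapMs));
    }, Promise.resolve()).then(() => results);
}

// Usage sketch: runRateLimited(work, item => getMethodResponse(...), 500);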
Here's an example of a scheme for limiting the number of requests per second you are making: How to Manage Requests to Stay Below Rate Limiting.