Sporadic console message - Request failed with status code 429 - node.js

I have this error or warning that seems to come up sporadically in the console. In the browser, the request will hang a little and then eventually load, or eventually time out. Other times the page loads normally.
Error
getWeather: Error: Request failed with status code 429
For brevity, this is part of my code that seems to produce the error
// get the weather from open weather map
let getWeather = new Promise(function (resolve, reject) {
  getCity.then(
    apiData => {
      axios.all([
        axios.get(apiData.weatherUrl),
        axios.get(apiData.imageUrl)
      ])
        .then(axios.spread((weatherRes, imageRes) => {
          const weather = weatherRes.data;
          apiData.weatherForcast = `It's ${weather.main.temp} degrees Celsius in ${weather.name}`;
          const imageApiData = imageRes.data;
          apiData.largeImageURL = imageApiData.hits[0]['largeImageURL'];
          resolve(apiData);
        }))
        .catch(function (error) {
          console.error("getWeather: " + error);
        });
    },
    error => {
      reject(error);
      res.end("Could not query the get weather");
    }
  );
});
The error code is related to too many requests. What's causing this and how do I fix it? Perhaps you could dumb it down a little because I'm a bit new to Node. Thanks.

I hope this is not too late to answer your question, but I encountered the same issue when working on a project so perhaps this might still be of use to you.
The code 429 translates to a client error: Too Many Requests.
The API you are trying to call is probably limiting you to a certain number of requests per hour, and you have exceeded that limit. Check whether you are calling the API multiple times; if so, try calling it just once, place all the data you need into an object or some other data structure, and then work with that data instead of calling the API again.
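For example, one simple way to cut down the calls in your code would be to keep the last response in memory and reuse it for a short window instead of hitting the API on every page load. This is only a rough sketch, and the cache window and variable names are assumptions you would tune to your provider's limits:
// naive in-memory cache so repeated page loads reuse the last API response
let cachedWeather = null;
let cachedAt = 0;
const CACHE_MS = 10 * 60 * 1000; // assumed window, adjust to your rate limit

function getWeatherCached(weatherUrl) {
  if (cachedWeather && Date.now() - cachedAt < CACHE_MS) {
    return Promise.resolve(cachedWeather); // serve from cache, no API call
  }
  return axios.get(weatherUrl).then((res) => {
    cachedWeather = res.data;
    cachedAt = Date.now();
    return cachedWeather;
  });
}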
Hope this helps!

Related

Got an error message in the console when I try to update my field value in MongoDB through Express

My Node app crashes when I send a request to update a field value in my MongoDB. The data is updated successfully, but the message I provided in the callback below is never shown: (
(err) => {
  if (err) {
    res.status(500).json({
      error: "There was an error in server side!",
    });
  } else {
    res.status(200).json({
      message: "value updated successfully!",
    });
  }
}
)
Instead of showing the above message, Mongoose sent me this (from const err = new MongooseError('Query was already executed: ' + str);) and more:
MongooseError: Query was already executed: Todo.updateOne({ _id: new ObjectId("6243ed2b5e0bdc9ab780b4d9...
But I use a different id each time and update with a different message.
When I check in the db whether the old value was updated or not, the old value is in fact updated, but I can't see any message or anything in the Postman response. Also, in my console MongoDB throws me the above error.
Whatever happens, I want to see my predefined error messages or success messages.
Finally, I got and understood the answer. I am a beginner programmer and developer, which is why I ran into this problem, and it took a long time to solve. I fixed it within 3-5 minutes by removing async/await, but it took much longer to dig into why it works after removing async/await. Here is the basic concept of asynchronous programming: if we use async/await we don't have to use a callback as well, and if we use a callback we don't need async/await, because that is just redundant. So, if we want to get data the way we do from
callback(err, data) => {
  res.send(data);
}
then with async/await we just assign the whole thing to a variable instead:
try {
  const data = await (our code);
  res.send(data);
} catch (err) {
  res.send(err.message);
}
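Applied to the question, the "Query was already executed" error comes from combining await with a callback on the same updateOne call, so Mongoose runs the query twice. A minimal sketch of the await-only version follows; the route path and update body are assumptions, with the Todo model name taken from the error message:
// PUT /todos/:id - await the query and do NOT pass a callback as well
app.put("/todos/:id", async (req, res) => {
  try {
    await Todo.updateOne(
      { _id: req.params.id },
      { $set: req.body }
    );
    res.status(200).json({ message: "value updated successfully!" });
  } catch (err) {
    res.status(500).json({ error: "There was an error in server side!" });
  }
});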

Getting TRIGGER_PAYLOAD_TOO_LARGE error on simple Firebase Cloud Function

I am getting this error when a cloud function is executed:
Error: TRIGGER_PAYLOAD_TOO_LARGE: This request would cause a function payload exceeding the maximum size allowed.
This is the cloud function causing the error:
exports.inyectSerie = functions.database.ref('forms/{pushId}').onCreate(event => {
  if (!admin.apps.length) {
    admin.initializeApp();
  }
  var form = event.val();
  var formData = {
    serie: form.serie
  };
  admin.database().ref('series/' + form.serie).set(formData);
});
How do I know this is the function causing the error? I removed all cloud functions from my Firebase project and it worked as expected. Then I put back just this inyectSerie function and it gave me the error again.
This is my Firebase structure, with the "medidores" node being the one holding the most data, at 150k records (which doesn't sound like a lot to me):
+fallidas
+forms <-- This has only 20 records
+materiales
+medidores <-- This has 150,000+ records
+series
+users
As you can see, the medidores node is never touched by the cloud function.
I searched for the error and only found this other question reporting it, but I think the cloud function causing the problem in that case did access all the records in the db.
The only thing that comes to mind as a possible cause in my case is that functions.database loads the whole database no matter what.
UPDATE: Even after reducing my trigger to a bare minimum (thanks, James Poag), I am getting the same error.
exports.inyectSerie = functions.database.ref('forms/{pushId}').onCreate(event => {
  return null;
});

Error: Socket hang up on multiple GET requests

I am developing a script in Node.js that sends a lot of requests to an API. After several requests (more than 380), we receive the following error message: Error: socket hang up (code: ECONNRESET). This is a big issue for our script since we would like to send around 10,000 requests.
This is not an issue with the rate limit of the API, because we are already handling that.
Our script runs on an OVH server, and we send our requests using the request-promise package. Our version of Node.js is v9.9.0.
Here is the function where the error is thrown :
const pollSession = async (sessionUrl) => {
  let session;
  try {
    session = await rp.get({ url: sessionUrl, json: true }, (err, res, body) => {
      if (err) {
        console.log('Err: ', err);
      } else {
        DEBUG && console.log("Status code: ", res && res.statusCode);
        DEBUG && console.log("Status: ", res && res.body && res.body.Status);
        statusCode = res && res.statusCode;
        status = res && res.body && res.body.Status;
      }
    });
  } catch (e) {
    console.log("----- pollSession : in catch with return value :" + e);
    return e;
  }
  return session;
}
When the request works, we call this function a few times in order to get the full response (because the response is huge).
When the error "Err: { Error: socket hang up" is thrown, we call the function again and it returns the same error. We can't afford to give up on those requests, so we would like to know how to work around this error. Maybe it is possible to increase the max number of sockets (I saw this was possible with an http agent, but we are using the request-promise package)?
Let me know if you need further information.
After a lot of tests, I found out that this is related to the API I am sending requests to, Skyscanner for the record. Some flights I am searching for take too long to retrieve and lead to this error. I fixed the issue by catching the error.
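For reference, since request-promise forwards its options to request, you can also pass a keep-alive agent to reuse connections and raise the socket limit. This is only a sketch, the maxSockets value is an assumption, and it will not help if the remote server is actively dropping connections:
const https = require("https");
const rp = require("request-promise");

// reuse connections and allow more parallel sockets than the default
const keepAliveAgent = new https.Agent({ keepAlive: true, maxSockets: 50 });

const pollSession = async (sessionUrl) => {
  return rp.get({ url: sessionUrl, json: true, agent: keepAliveAgent });
};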

Using Redis as a cache as a REST API user (in order to save API requests)

I am an API user and I have only a limited number of requests available for a high-traffic website (~1k concurrent visitors). In order to save API requests I would like to cache the responses for specific requests which are unlikely to change.
However I want to refresh this redis key (the API response) at least every 15 seconds. I wonder what the best approach for this would be?
My ideas:
I thought the TTL field would be handy for this scenario: just set a TTL of 15 s for this key. When I query this key and it's not present, I would request it again using the API. The problem: since this is a high-traffic website, I would expect around 20-30 requests to arrive before I've got a response from the API, and this would lead to 20-30 requests to the API within a few ms. So I would need to "pause" all incoming requests until there is an API response.
My second idea was to refresh the key every 15 s. I could set up a background task which runs every 15 s, or upon page request I could check in my controller whether the key needs a refresh. I would prefer the latter, but then I would need to track the age of the Redis key myself, and that seems expensive and is not a built-in feature?
What would you suggest for this use case?
My controller code:
function players(req, res, next) {
  redisClient.getAsync('leaderboard:players').then((playersLeaderboard) => {
    if (!playersLeaderboard) {
      // We need to get a fresh copy of the playersLeaderboard
    }
    res.set('Cache-Control', 's-maxage=10, max-age=10')
    res.render('leaderboards/players', {playersLeaderboard: playersLeaderboard})
  }).catch((err) => {
    logger.error(err)
  })
}
Simply fetch and cache the data when the Node.js server starts, then set an interval of 15 seconds to fetch fresh data and update the cache. Avoid using a TTL for this use case.
function fetchResultsFromApi(cb) {
  apiFunc((err, result) => {
    // do some error handling
    // cache result in redis without ttl
    cb();
  });
}

fetchResultsFromApi(() => {
  app.listen(port);
  setInterval(() => {
    fetchResultsFromApi(() => {});
  }, 15000);
});
Pros:
Very simple to implement
No queuing of client request required
Super fast response times
Cons:
The cache update might not execute/complete exactly after every 15th second. It might be a few milliseconds here and there. I assume that it won't make a lot of difference for what you are doing and you can always reduce the interval time to update cache before 15 seconds.
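For completeness, the "cache result in redis without ttl" step could look roughly like this. It assumes the same promisification that gives you getAsync in the question also provides setAsync, and that the API result is JSON-serialisable:
function fetchResultsFromApi(cb) {
  apiFunc((err, result) => {
    if (err) {
      logger.error(err); // keep serving the previous cached value
      return cb();
    }
    // store without a TTL so a slow or failed refresh never leaves the key empty
    redisClient.setAsync('leaderboard:players', JSON.stringify(result))
      .then(() => cb())
      .catch((cacheErr) => {
        logger.error(cacheErr);
        cb();
      });
  });
}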
I guess this is more of an architecture question than the typical "help, my code doesn't work" kind.
Let me paraphrase your requirements.
Q: I would like to cache the responses of some HTTP requests which are unlikely to change and I would like these cached responses to be refreshed every 15 seconds. Is it possible?
A: Yes it is, and you are going to be thankful that JavaScript is single-threaded, because it makes this quite straightforward.
Some fundamental knowledge here: Node.js is an event-driven runtime, which means that at any point in time it executes only one piece of code, all the way until it is done.
If an async call is encountered along the way, it fires it off and adds an event to the event loop that says "call back when a response is received". When the current code routine is finished, it pops the next event from the queue and runs it.
Based on this, we can achieve the goal with a function that fires off only one async call to update the cached responses each time they expire. If an async call is already in flight, the new request's callback is simply put into a queue, so you never make multiple async calls to fetch the same new result.
I'm not familiar with the async module, so I have provided a pseudo-code example using promises instead.
Pseudo code:
var fetch_queue = [];
var cached_result = {
  "cached_result_1": {
    "result": "test",
    "expiry": 1501477638 // epoch time 15s in future
  }
};

var get_cached_result = function (lookup_key) {
  if (cached_result.hasOwnProperty(lookup_key)) {
    if (result_expired(cached_result[lookup_key].expiry)) {
      // Expired, fetch an updated result (or queue behind the fetch in progress)
      return update_result();
    }
    else {
      // Not expired, safe to use cached result
      return new Promise(function (resolve) {
        resolve(cached_result[lookup_key].result);
      });
    }
  }
};

var update_result = function () {
  if (fetch_queue.length === 0) {
    // No other request is retrieving an updated result.
    return new Promise(function (resolve, reject) {
      // call your API to get the result.
      // When done call.
      resolve("Your result");
      // Inform other requests that an updated response is ready.
      fetch_queue.forEach(function (promise) {
        promise.resolve("Your result");
      });
      // Compute the new expiry epoch time and update the cached_result
    });
  }
  else {
    // Create a promise and park it into the queue
    return new Promise(function (resolve, reject) {
      fetch_queue.push({
        resolve: resolve,
        reject: reject
      });
    });
  }
};

get_cached_result("cached_result_1").then(function (result) {
  // reply the result
});
Note: As the name suggests, the code is not an actual working solution, but the concept is there.
Something worth noting: setInterval is one way to go, but it doesn't guarantee that the function will be called exactly at the 15-second mark; the API only ensures that something will happen after the expected time.
The proposed solution, on the other hand, ensures that once the cached result has expired, the very next request looking it up will trigger a fetch, and the following requests will wait for that initial request to return.
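The same idea can also be expressed more compactly by sharing a single in-flight promise instead of keeping an explicit queue. A sketch, where callApi and the 15 s window are assumptions:
let cached = null;   // { value, expiresAt }
let inFlight = null; // promise of an update already in progress

function getLeaderboard() {
  if (cached && Date.now() < cached.expiresAt) {
    return Promise.resolve(cached.value); // fresh enough, reuse it
  }
  if (!inFlight) {
    // only the first caller triggers the API call; everyone else awaits the same promise
    inFlight = callApi()
      .then((value) => {
        cached = { value, expiresAt: Date.now() + 15000 };
        return value;
      })
      .finally(() => { inFlight = null; });
  }
  return inFlight;
}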

AngularJS Mongoose error handling

I've built a simple application with AngularJS. Part of this application calls REST services, and for that I'm using Mongoose. Everything works great, but I'd like to handle errors better. Here is some sample code:
Express:
DBCollection.find({}, function (err, tuples) {
  if (err) {
    console.log('Error!');
  }
  res.send(JSON.stringify(tuples));
});
AngularJS:
DBService.query(function (res) {
  $scope.data.lists = res;
});
The problem I'm facing is as follows. Imagine I get an error on the MongoDB server side. I have an error, so I log it to the console, and then what? What happens on the AngularJS/front-end side? If I send the error as an HTTP response, I suppose Angular would interpret it as the response to the query, but with unexpected content, and produce an exception? How do I deal with that?
Angular is like Santa: it knows when responses are bad or good. There are two solutions. One is to create an error handler on each request. The other is to use $httpProvider.interceptors to globally handle errors before they become a problem at the individual request level.
Option 1
DBService.query(function (res) {
  $scope.data.lists = res;
}, function (errorResult) {
  console.log(errorResult); // <- take a peek in here, find something useful
});
Option 2
$httpProvider.interceptors.push(['$q', function ($q) {
  return {
    'responseError': function (rejection) {
      console.log(rejection); // <- take a peek in here, find something useful
      return $q.reject(rejection);
    }
  };
}]);
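On the Express side, make sure a failure actually goes out with an error status code; otherwise Angular will treat the body as a normal successful response. A minimal sketch based on the question's handler (the error message text is an assumption):
DBCollection.find({}, function (err, tuples) {
  if (err) {
    console.log('Error!');
    // reply with a 5xx so the client-side error handler / interceptor fires
    return res.status(500).json({ error: 'Database query failed' });
  }
  res.send(JSON.stringify(tuples));
});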