I am making some HTTP requests to links that I filtered in advance. The links are stored as strings in an array.
for (let link of linkArray) {
    const linkOptions = {
        uri: link,
        resolveWithFullResponse: true,
    };
    const linkRequest = await rp(linkOptions);
    // some other actions with linkRequest response
}
But of course some of the links can respond with bad statuses (for example 404 Not Found or 403 Forbidden, basically status codes greater than 300).
Here I don't want the program to crash; instead I want to skip the forbidden response and continue checking the remaining links of linkArray.
What I have tried so far:
Below the comment I check the status code; if it's a bad one, I continue to the next loop iteration.
if (linkRequest.statusCode === 204 || linkRequest.statusCode >= 300) continue;
But the request fails when it is a bad request, before my check ever runs. Any ideas how to handle this?
P.S. I am using the request-promise module.
The program crashes because you do not catch the error.
You should always catch errors, especially when doing something error-prone such as an HTTP request.
Here I've added a try/catch:
const rp = require('request-promise');

for (let link of linkArray) {
    const linkOptions = {
        uri: link,
        resolveWithFullResponse: true,
    };

    let linkRequest;
    try {
        linkRequest = await rp(linkOptions);
    } catch (err) {
        // do something with the error or at least log it
        console.log(err);
        continue; // skip this link and move on to the next one
    }

    // some other actions with linkRequest response
}
So when rp() fails, execution goes straight to the catch block but does not exit the loop, so processing continues with the remaining links in the array.
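If you would rather keep the status-code check from the question instead of treating every non-2xx response as an exception, request-promise also has a simple option: it defaults to true, which makes the promise reject on non-2xx status codes before your check can ever run. Setting it to false lets the promise resolve with the full response so you can inspect the status yourself (transport errors such as DNS failures still reject, so the try/catch above remains useful). A minimal sketch of that variant:

const rp = require('request-promise');

// inside an async function, as in the original loop
for (let link of linkArray) {
    const linkOptions = {
        uri: link,
        resolveWithFullResponse: true,
        simple: false, // resolve even for non-2xx status codes instead of rejecting
    };
    const linkRequest = await rp(linkOptions);
    // skip responses with bad status codes and continue with the next link
    if (linkRequest.statusCode === 204 || linkRequest.statusCode >= 300) continue;
    // some other actions with linkRequest response
}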
Is there a way to handle an error so that it returns right back after the error is triggered?
if (command === 'test') {
    message.author.send('Dm Test').catch(error => {
        message.reply('I failed to send you a private message!');
        return;
    });
    // some code below, should not trigger after the send-message error.
}
The problem is that .catch runs last. How can I handle this error and return immediately instead of also running the code below? I tried to use try { but that didn't work.
message.author.send('👌')
    .catch(() => message.reply("Can't send DM to your user!"));
I would like to know if there is another way to handle the error. Any help will be much appreciated.
The reason why .catch() executes after the rest of your code is because .send() is asynchronous and returns a Promise. Think of it this way: each time you send a message, discord.js has to send an HTTP request to the Discord API, wait for a response, and then give you the data from the response. That process is not instantaneous and takes some time, which is why using the Promise structure is very useful.
As for the specific problem, you simply want to await the response of your .catch(). This can be done by making the function you are running this code in async. Here is an example:
client.on("messageCreate", async (message) => {
let response = await message.author.send('👌').catch(() => {
message.reply("Can't send DM to your user!"));
return false;
});
// (If error occurred, 'response' will be false)
if (!response) return; // Return if the error occurred
// ... Code for if error did not occur
// (If no error occurred, 'response' will contain sent message)
});
The await keyword will wait until the asynchronous line you are executing has been fulfilled (i.e. until the return value has been obtained or an error has been caught) before continuing on with your code. Note the use of the async keyword. You can only use await within functions marked async.
If you place a return before sending the error message, JavaScript will read that as you returning the message, so if you do the following:
message.author.send('👌')
    .catch(() => { return message.reply("Can't send DM to your user!"); });
you'll have the error message be the final command run and nothing else will follow.
You can use try/catch inside the if statement to know whether or not you can DM the person.
if (command === 'test') {
    try {
        // send() returns a Promise, so this only works if the enclosing function is async
        await message.author.send("Something right here");
        // something exploration code here (runs only if the DM succeeded)
    } catch {
        message.channel.send("I can't dm this person.");
    }
}
I've worked with Node for 2 years now but cannot solve the following requirements:
I have an array of ~50,000 parameters.
I need to loop through the array and make a GET request, always to the same URL, with the parameter appended.
I need to write the result of each call back to the array.
This has to be done one by one, as I cannot call the API with several requests in parallel.
I'm sure there is a simple solution for this, but everything I tried didn't make the code wait for the GET request to return. I know that doing things synchronously in Node is not the way we should do things, but in this specific situation it is by design that the process shall not go on until the result comes back.
Any hint appreciated
Regards
Use a for loop, use a means of doing the GET request that returns a promise (such as the got() library) and then use await to pause the for loop until your response comes back.
const got = require('got');
const yourArray = [...];
async function run() {
    for (let [index, item] of yourArray.entries()) {
        try {
            let result = await got(item.url);
            // write the result of the call back into the array
            yourArray[index] = result.body;
        } catch (e) {
            // either handle the error here or throw to stop further processing
        }
    }
}
run().then(() => {
    console.log("all done");
}).catch(err => {
    console.log(err);
});
I'm new to Node and the async programming model. I'm having problems dealing with a simple requirement that seems pretty basic in synchronous environments: paging through an API response until the response is empty.
More specifically, the API, on a successful call, will return data and a status of 200 or 206 (partial content). If I see the 206 response, I need to keep making calls to the API (also sending a page query param that I increment each time) until I see the 200 response.
In a synchronous language, the task will be a piece of cake:
// pseudocode
data = []
page = 1
do {
    response = api.call(page)
    data.append(response.data)
    page++
} while (response.status != 200)
return data
Now, in Node, for a single api call, code like this will work:
// fire when '/' has a GET request
app.get('/', (req, res) => {
    axios.get('https://api.com/v1/cats')
        .then(response => {
            // now what??
        });
});
See the //now what?? comment? That's the point where I'm wondering how to proceed. I came across this somewhat-relevant post but am not able to convert this to a format that will work for me in Node and Axios.
Is it enough to just wrap the axios code in a separate function? I don't think so, because if I do this:
function getData(pageNum) {
    axios.get('https://api.com/v1/cats')
        .then(response => {
            // now what??
        });
}
I can't rely on a return value because as soon as axios.get() is executed, the function will be over. I can call getData() again after I get the first response, but then, suppose I want to return all the data from these multiple calls as the HTTP response from my Express server... how do I do that?
I hope I will not get downvoted for laziness or something. I've really looked around but not found anything relevant.
First, a counter-question: Is the data set so big that you need to worry about using up all the memory? Because if so then it will take more work to structure your code in a way that streams the data all the way through. (In fact I'm not even sure whether express allows streaming... you are using express aren't you?)
From the axios documentation, it looks like response is a readable stream which provides the response body. So reading it is also an asynchronous task. So you should write a function that does that. See the "Stream" page of the nodejs docs for more details. Or I could be persuaded to help with that too, time permitting. But for now, I'll assume you have a function readResponse, which takes an axios response object as an argument and returns a promise, and the promise resolves to an object such as { statusCode: 206, result: ['thing1', 'thing2'] }. I'll also assume that your goal is to get all the result arrays and concatenate them together to get e.g. ['thing1', 'thing2', 'thing3', 'thing4', 'thing5', 'thing6'].
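As a side note: if the API responds with JSON, axios will already have parsed the body and exposed it as response.data, with the status code in response.status, so a minimal readResponse could be as simple as the following sketch (treat it as an assumption about the shape of the API's response, not a definitive implementation):

// Minimal sketch of the assumed readResponse helper.
// Assumes the API returns a JSON array of results, which axios
// has already parsed into response.data.
function readResponse(response) {
    return Promise.resolve({
        statusCode: response.status,
        result: response.data
    });
}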
You could write a self-calling version of your getData function. This will retrieve all data from a given page onwards (not just the page itself):
function getData(pageNum) {
    return axios.get('https://api.com/v1/cats' + (pageNum ? '?page=' + pageNum : ''))
        .then(readResponse)
        .then(function(parsedResponse) {
            if (parsedResponse.statusCode == 200) {
                return parsedResponse.result;
            } else if (parsedResponse.statusCode == 206) {
                return getData(pageNum + 1).then(function(laterData) {
                    return parsedResponse.result.concat(laterData);
                });
            } else {
                // error handling here: throw an exception or return a failing promise.
            }
        });
}
Then, to get all data, just call this function with pageNum = 0:
// fire when '/' has a GET request
app.get('/', (req, res) => {
    getData(0)
        .then(function(results) {
            // results is now the array you want.
            var response = JSON.stringify(results); // or whatever you're doing to serialise your data
            res.send(response);
        });
});
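If you prefer something closer to the original do/while pseudocode, the same logic can also be written as a loop with async/await instead of recursion. A sketch, assuming the same readResponse helper as above (the getAllData name is just for illustration):

async function getAllData() {
    let results = [];
    let page = 0;
    let parsedResponse;
    do {
        const response = await axios.get('https://api.com/v1/cats' + (page ? '?page=' + page : ''));
        parsedResponse = await readResponse(response);
        results = results.concat(parsedResponse.result);
        page++;
    } while (parsedResponse.statusCode === 206);
    return results;
}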
I am developing a script in Node.js that sends a lot of requests to an API. After several requests (more than 380), we receive the following error message: Error: socket hang up (code: ECONNRESET). This is a big issue for our script since we would like to send around 10,000 requests.
This is not an issue with the rate limit of the API, because we are already handling that.
Our script runs on an OVH server, and we send our requests using the request-promise package. Our version of Node.js is v9.9.0.
Here is the function where the error is thrown :
const pollSession = async (sessionUrl) => {
    let session;
    try {
        session = await rp.get({ url: sessionUrl, json: true }, (err, res, body) => {
            if (err) {
                console.log('Err: ', err);
            } else {
                DEBUG && console.log("Status code: ", res && res.statusCode);
                DEBUG && console.log("Status: ", res && res.body && res.body.Status);
                statusCode = res && res.statusCode;
                status = res && res.body && res.body.Status;
            }
        });
    } catch (e) {
        console.log("----- pollSession : in catch with return value :" + e);
        return e;
    }
    return session;
};
When the request works, we call this function a few times in order to get the full response (because the response is huge).
When the error "Err: { Error: socket hang up" is thrown, we call the function again and it returns this error again. We can't afford to give up on those requests, so we would like to know how to work around this error. Maybe it is possible to increase the max number of sockets? (I saw this was possible with an http agent, but we are using the request-promise package.)
Let me know if you need further information
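On the socket question: request-promise passes its options through to request, which accepts agent-related settings such as forever and pool. So, as far as I know, something along these lines can be used to enable keep-alive connections and raise the per-host socket limit — a sketch, with the values as arbitrary examples to tune:

const rp = require('request-promise');

// these extra options are forwarded to the underlying request library
const options = {
    url: sessionUrl,
    json: true,
    forever: true,             // reuse TCP connections via a keep-alive agent
    pool: { maxSockets: 10 },  // raise the per-host socket limit
};

// then, inside pollSession as before:
// session = await rp.get(options);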
After a lot of tests, I found out that this is related to the API I am sending requests to, Skyscanner for the record. Some flights I am searching take too long to be retrieved and lead to this error. I fixed this issue by catching the error.
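In the same spirit, a small catch-and-retry wrapper is one way to keep a long run going when a single request dies with a socket hang up. A sketch around the existing pollSession options; the retry count is an arbitrary choice and the function name is just for illustration:

const pollSessionWithRetry = async (sessionUrl, retries = 3) => {
    for (let attempt = 1; attempt <= retries; attempt++) {
        try {
            return await rp.get({ url: sessionUrl, json: true });
        } catch (err) {
            console.log(`Attempt ${attempt} failed: ${err.message}`);
            if (attempt === retries) throw err; // give up after the last attempt
        }
    }
};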
I'm trying to pipe an image stored on Amazon S3 using node-request. However, sometimes an image doesn't exist, which is exposed by S3 as a status code 403. I'm struggling to see how I can pipe in case of success (200) but take an alternative action in case of a non-200.
Using the abort() method seemed like the way to go, but I'm getting r.abort is not a function, even though it should be available on the request.
// context: inside a request handler, where `res` is the response to write to
const r = request(url)
    .on('response', function (response) {
        if (response.statusCode !== 200) {
            r.abort(); // failing
            // I want to stop the piping here and do whatever instead.
        }
    })
    .pipe(res);
Thoughts?
To answer my own question: don't start piping until you're sure the response is correct:
request(url)
    .on('response', function (r) {
        if (r.statusCode !== 200) {
            // take other action
            return;
        }
        r.pipe(res);
    });
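For completeness: the reason r.abort was not a function in the original snippet is that .pipe(res) returns the destination stream, so r ended up being res rather than the request. Keeping a separate reference to the request makes the abort() approach work as well — a sketch, where the non-200 handling is just a placeholder:

const r = request(url);
r.on('response', function (response) {
    if (response.statusCode !== 200) {
        r.abort();                 // stop downloading the body
        res.writeHead(response.statusCode);
        res.end();                 // or whatever alternative action fits
        return;
    }
    r.pipe(res);
});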