Stop loop when getting a specific text response from server - node.js

I'm working with an API server that communicates via XML.
I need to send, let's say, 20 identical POST requests.
I'm writing this in Node.js.
Easy.
BUT - since I'm going to multiply the process, and I want to avoid flooding the server (and getting kicked), I need to break the sending loop IF the (XML) response contains a specific text (a success signal): 555, or actually just '555' (the text is wrapped in other XML phrases).
I tried to break the loop based on the success signal, and also tried "exporting" it outside the loop (thinking it could be nice to address it in the loop's condition).
Guess it's easy, but being a newbie, I had to call for some help :)
Attaching the relevant code (simplified).
Many thanks!
const fetch = require("node-fetch");

const url = "https://www.apitest12345.com/API/";
const headers = {
  "LOGIN": "abcd",
  "PASSWD": "12345"
};
const data = '<xml></xml>';

let i = 0;
do { // the loop
  fetch(url, { method: 'POST', headers: headers, body: data })
    .then((res) => {
      return res.text();
    })
    .then((text) => {
      console.log(text);
      if (text.indexOf('555') > 0) { // if the response includes '555' it means SUCCESS, and we can stop the loop
        ~STOP!~ // help me stop the loop :)
      }
    });
  i += 1;
} while (i < 20);

Use a simple for loop with async/await. Note that await is only valid inside an async function, so the loop is wrapped in an async IIFE, and res.text() itself returns a Promise that has to be awaited before you can search the body.
const fetch = require("node-fetch");

const url = "https://www.apitest12345.com/API/";
const headers = {
  "LOGIN": "abcd",
  "PASSWD": "12345"
};
const data = '<xml></xml>';

(async () => {
  for (let i = 0; i < 20; i++) {
    const res = await fetch(url, { method: 'POST', headers: headers, body: data });
    const text = await res.text(); // res.text() resolves to the XML body
    if (text.indexOf('555') !== -1) // success signal received, stop sending
      break;
  }
})();
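Since the stated goal is to avoid flooding the server, a short pause between attempts may also help. Here is a minimal sketch reusing url, headers, and data from above; the one-second delay is an arbitrary assumption, not something the API requires:

const sleep = require("util").promisify(setTimeout);

(async () => {
  for (let i = 0; i < 20; i++) {
    const res = await fetch(url, { method: 'POST', headers: headers, body: data });
    const text = await res.text();
    if (text.indexOf('555') !== -1) break; // success signal received, stop sending
    await sleep(1000); // arbitrary 1s pause between attempts to be gentle on the server
  }
})();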

Related

Node JS - Increasing latency for get requests inside a map method

I have a fairly straightforward application which 1) accepts an array of URLs, 2) iterates over those URLs, and 3) makes a GET request to stream media (video/audio). Below is a code snippet illustrating what I'm doing:
import request from 'request';
import { promisify } from 'util';

const tasks = urlsArray.map(async (url: string) => {
  const startTime = process.hrtime();
  let body = '';
  try {
    const response = await promisify(request).call(request, { url, method: 'GET',
      encoding: null, timeout: 10000 });
    body = response.body;
  } catch (e) {
    logger.warn('failed to make request');
  }
  const [seconds] = process.hrtime(startTime);
  logger.info(`Took ${seconds} seconds to make request`);
  return body;
});

(await Promise.all(tasks)).forEach((body) => {
  // processing body...
});
What I'm currently experiencing is that the time to make the request keeps rising as I make more requests and I'm struggling to understand why that is the case.

Can one batch requests with google-auth-library?

The way I have been using the client thus far has been like
client = new OAuth2Client(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  "http://localhost:5000/oauth2callback"
);
client.setCredentials({ refresh_token: getRefreshToken(user) });

let url = 'https://www.googleapis.com/gmail/v1/users/me/messages?q=myquery';
client.request({ url }).then((response) => {
  doSomethingWith(response);
});
Since response holds a list of message ids, I will have to use users.messages.get to get the actual data for each message. I would prefer not to do hundreds of separate requests just for one query. Is there a way to batch the users.messages.get requests?
You can use the Google APIs batch requests feature. Here's an example function that accepts an array of message IDs and the OAuth2 client as params, and then makes a batch request to the users.messages.get endpoint.
const batchGetMessages = (messageIds = [], oAuth2Client) => {
  const url = 'https://www.googleapis.com/batch/gmail/v1';
  const boundary = 'message_batch_demo';
  const headers = {
    'Content-Type': `multipart/mixed; boundary=${boundary}`
  };
  let data = '';
  for (const messageId of messageIds) {
    data += `--${boundary}\r\nContent-Type: application/http\r\n\r\n`;
    data += `GET /gmail/v1/users/me/messages/${messageId}`;
    data += '\r\n';
  }
  data += `--${boundary}--`;
  return oAuth2Client.request({
    url,
    headers,
    data,
    method: 'POST'
  });
};
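A minimal usage sketch, assuming client is the authenticated OAuth2Client from the question and the message IDs shown are placeholders taken from an earlier list query; note that response.data is a single multipart/mixed string that you still have to split on the boundary yourself:

batchGetMessages(['17c0f3a1b2c3d4e5', '17c0f3a1b2c3d4e6'], client)
  .then((response) => {
    // one multipart/mixed body containing every users.messages.get result
    console.log(response.data);
  })
  .catch(console.error);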

How to make api requests with a timer in node js?

I have an array which contains some data, and for each data item an API request has to be made. The API will remain the same, but the array index will increment every time the API request is made.
Also, the API requests have to be made with a gap of 5 minutes, so I can't call the API for the entire array all at once. One API call is made with Array[0] in the request body, and after 5 minutes the API is called with Array[1] in the request body.
I tried to implement a cron job with these requirements, but there are no proper examples of a cron job wrapping API calls in a for loop.
Any help would be appreciated.
const array = ['http://linkedin.com/charles123', 'http://linkedin.com/darwin123' ... ]
// API needs to be called every 5 minutes
const sendConnectionRequest = () => {
  for (let i = 0; i < array.length; i++) {
    fetch("serverurl:123", {
      headers: {
        'Content-Type': 'application/json'
      },
      method: "POST",
      body: JSON.stringify(array[i])
    })
      .then((res) => { if (res) { console.log('Connection Request Sent') } });
  }
}
May I suggest using an async generator? This will allow you to manage sequential promises.
const fetch = require("node-fetch");
const sleep = require("util").promisify(setTimeout);

async function* responseGenerator(urls) {
  let iterations = 0;
  while (urls.length) {
    const [url, ...rest] = urls;
    urls = rest;
    if (iterations > 0) {
      await sleep(300000); // wait 5 minutes between requests
    }
    yield fetch("serverurl:123", {
      headers: {
        "Content-Type": "application/json"
      },
      method: "POST",
      body: JSON.stringify(url)
    });
    iterations += 1;
  }
}

const array = ['http://linkedin.com/charles123', 'http://linkedin.com/darwin123'];

// for await is only valid inside an async function (or an ES module with top-level await)
(async () => {
  for await (const response of responseGenerator(array)) {
    // response.status
    // response.statusText
    // response.headers.get('content-type')
  }
})();
There are multiple ways of doing timers in Node.js. setInterval runs a callback in an endless loop, waiting a set amount of time between each iteration.
const array = ['http://linkedin.com/charles123', 'http://linkedin.com/darwin123' ... ]
// API needs to be called every 5 minutes
const sendConnectionRequest = (data) => {
  fetch("serverurl:123", {
    headers: {
      'Content-Type': 'application/json'
    },
    method: "POST",
    body: JSON.stringify(data)
  })
    .then((res) => { if (res) { console.log('Connection Request Sent') } });
}

const callApi = setInterval(() => {
  sendConnectionRequest(array[0]);
  array.shift();
  if (array.length === 0) clearInterval(callApi); // stop once every item has been sent
}, 300000); // 5 minutes

How to loop the url in options in nodejs

var request = require('request');
var options = {
  'method': 'GET',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=1', // to get all the users data from the repos
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=2',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=3',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=4',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=5',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=6',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=7',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=8',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=9',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=10',
  'url': 'https://api.github.com/orgs/organizationName/repos?per_page=100&page=11',
  'headers': {
    'Accept': 'application/vnd.github.mercy-preview+json', // to get topics of the repos
    'Authorization': 'Bxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
    'User-Agent': 'sxxxxxxxxxxxxx'
  }
};
request(options, function (error, response) {
  if (error) throw new Error(error);
  console.log(response.body);
});
In the above code I want to loop through the URLs continuously until the last page is reached.
If anyone has an idea of how to handle the pagination here, please help me out.
You cannot have multiple attributes for one object key, so you have to call every URL individually. I tried to solve this using asynchronous code, because looping with callback functions is confusing and dangerous with regard to the call stack.
const request = require('request');

// configuration for the url generation
const perPages = 100;
const startPage = 1;
const endPage = 11;
const url = 'https://api.github.com/orgs/organizationName/repos?per_page=%perPages%&page=%page%';

// define an asynchronous call for one url
async function callOneUrl(url) {
  // local options for each url
  const options = {
    method: 'GET',
    url: url,
    headers: {
      Accept: 'application/vnd.github.mercy-preview+json',
      Authorization: 'Bxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
      'User-Agent': 'sxxxxxxxxxxxxx'
    }
  };
  return new Promise((resolve, reject) => {
    request(options, function (error, response) {
      if (error) return reject(error);
      resolve(response);
    });
  });
}

// call each url with a for loop
(async () => {
  for (let i = startPage; i <= endPage; i++) {
    // use await to get the resolved value of the Promise instance or catch the error
    try {
      var response = await callOneUrl(url.replace('%perPages%', perPages).replace('%page%', i));
      // handle response here
      console.log(response.body);
    } catch (error) {
      // handle errors here
      throw new Error(error);
    }
  }
})();
const request = require('request-promise');

const urls = ["http://www.google.com", "http://www.example.com"];
const promises = urls.map(url => request(url));

Promise.all(promises).then((data) => {
  // data = [body1, body2], the resolved response bodies in request order
});

Apart from the above, you can also use async.eachSeries or async.parallel, etc.
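For example, here is a minimal async.eachSeries sketch, assuming the async package is installed; the URL list is the same placeholder list as above:

const async = require('async');
const request = require('request');

const urls = ["http://www.google.com", "http://www.example.com"];

// process one url at a time; the next request starts only after done() fires
async.eachSeries(urls, (url, done) => {
  request(url, (error, response) => {
    if (error) return done(error);
    console.log(`${url} -> ${response.statusCode}`);
    done();
  });
}, (err) => {
  if (err) console.error(err);
  else console.log('all requests finished');
});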
You can download a list of repos with a do...while loop. We'll set a maximum number of pages to download and exit when we reach either that limit or the last page.
I would suggest using the request-promise-native package to allow us to use the very nice async/await syntax.
Now, I've given the example of downloading repos for the mongodb org. You can easily replace it with whichever one you wish.
I would also note that the request library is now deprecated; we can use it, of course, but we should consider replacing it in the future.
We also log the repo information and save it to the output file.
const rp = require("request-promise-native");
const fs = require("fs");

async function downloadRepoInformation(org, outputFile) {
  let repoList = [];
  let page = 0;
  const resultsPerPage = 20;
  const maxPages = 10;
  const uri = `https://api.github.com/orgs/${org}/repos`;

  do {
    try {
      let response = await rp.get({ uri, json: true, qs: { per_page: resultsPerPage, page: ++page }, headers: { "User-Agent": "request" } });
      console.log(`downloadRepoInformation: Downloaded page: ${page}, repos: ${response.length}...`);
      repoList = repoList.concat(response);
      console.log("downloadRepoInformation: response", JSON.stringify(response, null, 2));
      console.log("downloadRepoInformation: repoList.length:", repoList.length);
      if (response.length < resultsPerPage) {
        console.log(`downloadRepoInformation: Last page reached: exiting loop...`);
        break;
      }
    } catch (error) {
      console.error(`downloadRepoInformation: An error occurred:`, error);
      break;
    }
  } while (page <= maxPages);

  console.log("downloadRepoInformation: download complete: repoList.length:", repoList.length);
  console.log("downloadRepoInformation: Saving to file:", outputFile);
  fs.writeFileSync(outputFile, JSON.stringify(repoList, null, 4));
}

downloadRepoInformation("mongodb", "./repolist.json");

Stream data from variable to PUT request

What I've been looking to do is make a GET request, manipulate the JSON data, store it in a variable, and then make a PUT request. Can't seem to find documentation on this. Maybe I am thinking about this wrong. Once I have the variable, I want to do something like the below. I have all of my data from the GET request saved to an outputV3.json file.
var outputJson = fs.readFileSync("outputV3.JSON");
outputJson = JSON.parse(outputJson);

(function () {
  for (let i = 0; i < outputJson.objects.length; i++) {
    let postId = outputJson.objects[i].id.toString();
    let newSlug = outputJson.objects[i].slug.replace("blog/", "");
    let urlToPut = "https://api.hubapi.com/content/api/v2/blog-posts/blogPostId?hapikey=" + process.env.HAPIKEY;
    urlToPut = urlToPut.replace("blogPostId", postId);
    let put_data = JSON.stringify({
      "slug": newSlug
    });
    put_data.put(urlToPut); // this is where I'm stuck: how do I PUT put_data to urlToPut?
  }
})();
If you need to stream your data from a file to the request, you should create a read stream from the file and pipe it to the destination:
const fs = require('fs');
const http = require('http');

const dataStream = fs.createReadStream('outputV3.JSON');
const options = {
  hostname: 'www.example.com',
  port: 80,
  path: '/destination',
  method: 'PUT',
  headers: {
    'Content-Type': 'application/json',
  }
};

const req = http.request(options, (res) => {
  // response processing...
});

dataStream.pipe(req);
And if you need more advanced logic for streaming, you should consider putting a custom Transform stream between the readable file stream and the writable request stream.
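A minimal sketch of that idea, reusing dataStream and req from above; the slug rewrite is a hypothetical illustration, and a naive per-chunk string replace like this can miss matches that straddle chunk boundaries:

const { Transform } = require('stream');

// hypothetical transform: rewrite the raw JSON text on its way to the PUT request
const rewriteSlugs = new Transform({
  transform(chunk, encoding, callback) {
    // naive per-chunk replace; fine as a sketch, not robust across chunk boundaries
    callback(null, chunk.toString().replace(/blog\//g, ''));
  }
});

dataStream.pipe(rewriteSlugs).pipe(req);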
If I wanted to use request-promise, could I do something like the following?
let options = {
  uri: 'https://api.hubapi.com/content/api/v2/blog-posts?limit=1000&hapikey=' + process.env.HAPIKEY,
  method: 'GET',
  transform: function (body, response) {
    for (var i = 0; i < outputJson.objects.length; i++) {
      var postId = outputJson.objects[i].id.toString();
      var newSlug = outputJson.objects[i].slug.replace("blog/", "");
    }
  }
};

rp(options)
  .then(function (removedSlug) {
    removedSlug.pipe(request.put('https://api.hubapi.com/content/api/v2/blog-posts/blogPostId?hapikey=' + process.env.HAPIKEY));
  });
