Can one batch requests with google-auth-library? - node.js

The way I have been using the client so far looks like this:
const { OAuth2Client } = require('google-auth-library');

const client = new OAuth2Client(
  process.env.GOOGLE_CLIENT_ID,
  process.env.GOOGLE_CLIENT_SECRET,
  "http://localhost:5000/oauth2callback"
);
client.setCredentials({ refresh_token: getRefreshToken(user) });

const url = 'https://www.googleapis.com/gmail/v1/users/me/messages?q=myquery';
client.request({ url }).then((response) => {
  doSomethingWith(response);
});
Since response holds a list of message ids, I will have to use users.messages.get to get the actual data for each message. I would prefer not to do hundreds of separate requests just for one query. Is there a way to batch the users.messages.get requests?

You can use the Google APIs batch request feature. Here's an example function that accepts an array of message IDs and the OAuth2 client as parameters and makes a single batch request to the users.messages.get endpoint.
const batchGetMessages = (messageIds = [], oAuth2Client) => {
  const url = 'https://www.googleapis.com/batch/gmail/v1';
  const boundary = 'message_batch_demo';
  const headers = {
    'Content-Type': `multipart/mixed; boundary=${boundary}`
  };
  let data = '';
  for (const messageId of messageIds) {
    data += `--${boundary}\r\nContent-Type: application/http\r\n\r\n`;
    data += `GET /gmail/v1/users/me/messages/${messageId}`;
    data += '\r\n';
  }
  data += `--${boundary}--`;
  return oAuth2Client.request({
    url,
    headers,
    data,
    method: 'POST'
  });
};
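For completeness, here is a rough sketch (not part of the original answer) of how the list request from the question could be chained to this helper. Note that the batch response body is a raw multipart/mixed string that still has to be split on the boundary before the individual message payloads can be JSON-parsed:
// Sketch only: chain users.messages.list to the batch helper above.
const listUrl = 'https://www.googleapis.com/gmail/v1/users/me/messages?q=myquery';
client.request({ url: listUrl })
  .then((listResponse) => {
    // users.messages.list returns { messages: [{ id, threadId }, ...] }
    const ids = (listResponse.data.messages || []).map((m) => m.id);
    return batchGetMessages(ids, client);
  })
  .then((batchResponse) => {
    // batchResponse.data is the multipart/mixed text: one HTTP response per requested message
    console.log(batchResponse.data);
  });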

Related

Axios Async and Await Loop through Post Request with Value from Array with Objects in nodejs

I am trying to call an existing API endpoint with a POST request multiple times in Node.js. For that I am using a JSON file with several hundred entries, a loop, and an axios POST request. The content of the file itself changes, but not its setup, as it is based on a template.
I am now running into the issue that the requests have to be made in a synchronous manner (that's why the await), because the file has an inherent order which would otherwise have to be changed manually in the application behind the API endpoint. So an array of [Object 1, Object 2, Object 3] should be created in that order, even if it is not the fastest way possible.
For my riskPostRequest function I am unable to set the title in data in a way that it will be pulled from the array of objects during my async importRisks function.
const jsonString = fs.readFileSync(filepath, 'utf-8')
const myReadStreamRisks = JSON.parse(jsonString)
console.log(myReadStreamRisks)

const riskPostRequest = risk => {
  axios({
    method: 'post',
    port: 443,
    url: baseURL + '/Risks',
    data: {
      parentFolderId: 11633,
      title: myReadStreamRisks.title
    }
  })
    .then((response) => {
      console.log(response.data.title, ';', response.data.id)
    })
    .catch(function (error) {
      console.log(error.message);
    }, myReadStreamRisks)
}

// Import the TeamStoreRisks
const importRisks = async () => {
  console.log('Start')
  for (var i = 0; i < myReadStreamRisks.length; i++) {
    const risk = myReadStreamRisks[i]
    const riskPost = await riskPostRequest(risk);
  }
}

importRisks()
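A minimal sketch (not part of the question) of one way the request function could be changed so that the loop really waits for each POST and the title comes from the current array element; it assumes each object in the file has a title property:
// Sketch only: return the axios promise so `await riskPostRequest(risk)` actually waits,
// and read the title from the risk that was passed in, not from the whole array.
const riskPostRequest = risk =>
  axios({
    method: 'post',
    url: baseURL + '/Risks',
    data: {
      parentFolderId: 11633,
      title: risk.title
    }
  })
    .then(response => console.log(response.data.title, ';', response.data.id))
    .catch(error => console.log(error.message));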

Node JS - Increasing latency for get requests inside a map method

I have a fairly straightforward application which 1) accepts an array of URLs, 2) iterates over those URLs, and 3) makes a GET request to stream media (video/audio). Below is a code snippet illustrating what I'm doing:
import request from 'request';
import { promisify } from 'util';

const tasks = urlsArray.map(async (url: string) => {
  const startTime = process.hrtime();
  let body = '';
  let headers = {};
  try {
    const response = await promisify(request).call(request, {
      url, method: 'GET', encoding: null, timeout: 10000
    });
    body = response.body;
    headers = response.headers;
  } catch (e) {
    logger.warn('failed to make request');
  }
  const [seconds] = process.hrtime(startTime);
  logger.info(`Took ${seconds} seconds to make request`);
  return body;
});

(await Promise.all(tasks)).forEach((body) => {
  // processing body...
});
What I'm currently experiencing is that the time to make the request keeps rising as I make more requests and I'm struggling to understand why that is the case.
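One detail worth noting (not stated in the question): urlsArray.map(async ...) starts every request at essentially the same time, so the measured time per request also includes time spent waiting for sockets and bandwidth, which grows as more requests are in flight. Purely as an illustration, a sketch of processing the URLs in fixed-size chunks inside the same async context as the snippet above; the chunk size of 5 is arbitrary:
// Sketch only: run the requests in chunks of `concurrency` instead of all at once.
const concurrency = 5;
const bodies: any[] = [];
for (let i = 0; i < urlsArray.length; i += concurrency) {
  const chunk = urlsArray.slice(i, i + concurrency);
  const results = await Promise.all(chunk.map((url: string) =>
    promisify(request).call(request, { url, method: 'GET', encoding: null, timeout: 10000 })
      .then((response) => response.body)
      .catch(() => '')
  ));
  bodies.push(...results);
}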

Firebase cloud function: http function returns null

Here is what I am trying to do:
1. I am introducing functionality to enable users to search for local restaurants.
2. I created an HTTP cloud function, so that when the client delivers a keyword, the function will call an external API to search for the keyword, fetch the responses, and deliver the results.
3. In doing #2, I need to make two separate URL requests and merge the results.
When I checked, the function does call the API, fetch the results and merge them without any issue. However, for some reason, it only returns null to the client.
Below is the code: could someone take a look and advise me on where I went wrong?
exports.restaurantSearch = functions.https.onCall((data, context) => {
  const request = data.request;
  const k = encodeURIComponent(request);
  const url1 = "an_url_to_call_the_external_API" + k;
  const url2 = "another_url_to_call_the_external_API" + k;
  const url_array = [url1, url2];
  const result_array = [];
  const info_array = [];
  url_array.forEach(url => {
    return fetch(url, { headers: { "Authorization": "API_KEY" } })
      .then(response => {
        return response.json()
      })
      .then(res => {
        result_array.push(res.documents);
        if (result_array.length === 2) {
          const new_result_array_2 = [...new Set((result_array))];
          new_result_array_2.forEach(nra => {
            info_array.push([nra.place_name, nra.address_name])
          })
          // info_array is not null at this point, but the below code only return null when checked from the client
          return info_array;
        }
      })
      .catch(error => {
        console.log(error)
        return 'error';
      })
  })
});
Thanks a lot in advance!
You should use Promise.all() instead of running each promise (fetch request) separately in a forEach loop. Also, I don't see the function returning anything if result_array.length is not 2. I can see you are only making 2 requests, but it's good to handle all possible cases, so try adding a return statement for when the condition is not satisfied. Try refactoring your code to this (I've used an async function):
exports.restaurantSearch = functions.https.onCall(async (data, context) => {
  // Do note the async ^^^^^
  const request = data.request;
  const k = encodeURIComponent(request);
  const url1 = "an_url_to_call_the_external_API" + k;
  const url2 = "another_url_to_call_the_external_API" + k;
  const url_array = [url1, url2];
  const responses = await Promise.all(url_array.map((url) => fetch(url, { headers: { "Authorization": "API_KEY" } })))
  const responses_array = await Promise.all(responses.map((response) => response.json()))
  console.log(responses_array)
  const result_array: any[] = responses_array.map((res) => res.documents)
  // Although this if statement is redundant if you will be running exactly 2 promises
  if (result_array.length === 2) {
    const new_result_array_2 = [...new Set((result_array))];
    const info_array = new_result_array_2.map(({ place_name, address_name }) => ({ place_name, address_name }))
    return { data: info_array }
  }
  return { error: "Array length incorrect" }
});
If you'll be running only 2 promises, another option would be:
// Directly adding promises in Promise.all() instead of using map
const [res1, res2] = await Promise.all([fetch("url1"), fetch("url2")])
const [data1, data2] = await Promise.all([res1.json(), res2.json()])
Also check Fetch multiple links inside of forEach loop

Stop loop when getting a specific text response from server

I'm working with some API server that communicates by XML.
I need to send, let's say: 20 identical POST requests.
I'm writing this in Node JS.
Easy.
BUT - since I'm going to multiply the process, and I want to avoid flooding the server (and getting kicked), I need to break the sending loop IF the (XML) response contains a specific text, a success signal: '555' (the text is wrapped in other XML phrases).
I tried to break the loop based on the success signal, and also tried "exporting" it outside the loop (thinking it could be nice to address it in the loop's condition).
I guess it's easy, but being a newbie, I had to call for some help :)
Attaching the relevant code (simplified).
Many thanks !
const fetch = require("node-fetch");

const url = "https://www.apitest12345.com/API/";
const headers = {
  "LOGIN": "abcd",
  "PASSWD": "12345"
}
const data = '<xml></xml>'

let i = 0;
do { // the loop
  fetch(url, { method: 'POST', headers: headers, body: data })
    .then((res) => {
      return res.text()
    })
    .then((text) => {
      console.log(text);
      if (text.indexOf('555') > 0) { // if the response includes '555' it means SUCCESS, and we can stop the loop
        ~STOP!~ // help me stop the loop :)
      }
    });
  i += 1;
} while (i < 20);
Use a simple for loop with async/await. Note that res.text() itself returns a promise, so it also has to be awaited, and since await is used, the loop has to live inside an async function:
const fetch = require("node-fetch");

const url = "https://www.apitest12345.com/API/";
const headers = {
  "LOGIN": "abcd",
  "PASSWD": "12345"
};
const data = '<xml></xml>';

// wrapped in an async function so that await can be used
async function main() {
  for (let i = 0; i < 20; i++) {
    const res = await fetch(url, { method: 'POST', headers: headers, body: data });
    const text = await res.text();
    if (text.indexOf('555') !== -1) // stop as soon as the success signal shows up
      break;
  }
}

main();

NodeJS Read file, make node-rest-client call to get related data and add retrieved response as element in JSON

I am reading a JSON file using fs.readFileSync, and for each document obtained I am making a REST API call using client.post. Once I get a response, I want to place the received content into another JSON file, which is a replica of the input JSON except for one additional element: the data received from the client.post call. However, probably because of the async nature of client.post, I am unable to add the element to the output JSON. I am new to Node.js. Can you please point out what I am missing? Below are the code and data.
data:
[
  {
    "ticker": "CLYD"
  },
  {
    "ticker": "EGH"
  }
]
Code:
var fs = require('fs');
var Client = require('node-rest-client').Client;

var data = fs.readFileSync(__dirname + "/data/stocks.json", "utf8");
processData(data);

function processData (data) {
  var obj = JSON.parse(data);
  for (j = 0; j < obj.length; j++) {
    obj[j].stockInformation = getValuesForTicker(obj[j].ticker.trim());
  }
  var jsonOutput = JSON.stringify(obj, null, '\t');
  fs.writeFileSync(__dirname + "/data/response.json", jsonOutput);
};

function getValuesForTicker (ticker) {
  /**
   * More details and samples at https://www.npmjs.com/package/node-rest-client
   */
  var client = new Client();
  var values;
  // set content-type header and data as json in args parameter
  var args = {
    data: { "ticker": ticker },
    headers: { "Content-Type": "application/json", "Accept": "application/json" }
  };
  var responseToRequest = client.post("https://url.providing.response.as.json.content/", args, function (data, response) {
    // parsed response body as js object
    values = JSON.parse(JSON.stringify(data)).price;
  });
  return values;
};
Since getValuesForTicker makes an async call to fetch data, it should not return the result directly (currently undefined is returned, because the function returns before the value is assigned). Instead it should call a callback once the data is received, or better, return a promise:
function getValuesForTicker (ticker) {
  /**
   * More details and samples at https://www.npmjs.com/package/node-rest-client
   */
  return new Promise(function (resolve, reject) {
    var client = new Client();
    // set content-type header and data as json in args parameter
    var args = {
      data: { "ticker": ticker },
      headers: { "Content-Type": "application/json", "Accept": "application/json" }
    };
    client.post("https://url.providing.response.as.json.content/", args, function (data, response) {
      // parsed response body as js object
      var values = JSON.parse(JSON.stringify(data)).price;
      resolve(values);
    });
  });
}
To get the data once the async call is done, you will need to call the then function, as below:
getValuesForTicker(obj[j].ticker.trim())
  .then(function (val) {
    obj[j].stockInformation = val
  })
Considering you are new to Node.js, this may be hard to grasp at first. Take some time to understand callbacks and promises first.
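A rough sketch (not from the original answer, and assuming the endpoint and its price field behave as described in the question) of how processData could wait for every ticker before writing the output file, using Promise.all with the promise-returning getValuesForTicker above:
// Sketch only: collect one promise per ticker and write the file after all of them resolve.
async function processData (data) {
  var obj = JSON.parse(data);
  await Promise.all(obj.map(function (entry) {
    return getValuesForTicker(entry.ticker.trim()).then(function (values) {
      entry.stockInformation = values;
    });
  }));
  fs.writeFileSync(__dirname + "/data/response.json", JSON.stringify(obj, null, '\t'));
}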
