Async HTTPS request in TypeScript - Node.js

I have been trying all day to get an HTTPS request working.
My code so far does not work; after calling it I get an "Unhandled error RangeError: Maximum call stack size exceeded at Function.entries".
import * as https from "https";
function openRequest(options: any)
{
    return new Promise((resolve, reject) => {
        const request = https.request(options).on('response', (response: any) => {
            resolve(response);
        }).on('error', (error: any) => {
            reject(error);
        });
        request.end();
    });
}
I have to use a built-in library, so a third-party one is not an option.
Can someone tell me where I am doing something wrong?

I've got this to run. TypeScript isn't my native language and I really don't want to make my living out of it, but a stranger on the internet suggested using for await on the res object in a loop:
const request = https.request(options, async (res: any) => {
    res.setEncoding("utf-8");
    let result = "";
    for await (const chunk of res) {
        result += chunk;
    }
    resolve(result);
}).on("error", (error) => {
    reject(error);
});
request.end();
I really don't know if .on("error", ...) will work, because .on("response", ...) has failed so far. But at least, on a good day at full moon, this code runs.
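For reference, a minimal self-contained sketch of the same pattern (a sketch, not the poster's final code; the typings and the usage example are assumptions): it wraps https.request in a Promise, collects the body with for await, and rejects on both stream and connection errors.
import * as https from "https";

function openRequest(options: https.RequestOptions): Promise<string>
{
    return new Promise((resolve, reject) => {
        const request = https.request(options, async (res) => {
            try {
                res.setEncoding("utf-8");
                let result = "";
                // Async iteration consumes the 'data' and 'end' events for us.
                for await (const chunk of res) {
                    result += chunk;
                }
                resolve(result);
            } catch (error) {
                // Stream errors during iteration land here.
                reject(error);
            }
        });
        // Connection-level failures (DNS, TLS, refused sockets) surface here.
        request.on("error", reject);
        request.end();
    });
}

// Hypothetical usage:
// openRequest({ hostname: "example.com", path: "/", method: "GET" })
//     .then((body) => console.log(body))
//     .catch((err) => console.error(err));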

Related

apollo-server-micro body stream reading issue (when used in Firebase Cloud Functions)

In some scenarios, the apollo-server-micro package tries to read a request body stream that has already been consumed, and it hangs forever because it never receives the stream's events (the body has already been fully read).
I went through the whole flow step by step to find the issue.
In brief, the issue appears when the request stream passed to Apollo has already been read, either because we already attached listeners like body.on('data', onData) and body.on('end', onEnd), or because another part of the chain consumed it (an Express server, a Next.js server, a Firebase Cloud Function).
If that has happened, apollo-server-micro tries to read the stream again, but the read never completes and we either fail with a timeout or never get a response: body.on('data', ...) and body.on('end', ...) are never called again, because the stream was already fully consumed and those events will not fire a second time.
So I think some way is needed to handle this situation and give Apollo the option to work with a body that has already been received. Maybe Apollo needs a way to skip reading the body stream when the body already exists and instead accept the prepared buffer through some property, so it does not re-read a stream I can already provide.
I found a hack that works, but it really needs to be done in a more proper way.
Apollo uses the json function from the micro package (https://github.com/vercel/micro) to read the body stream. If I change this line there:
const body = rawBodyMap.get(req);
to something like:
const body = rawBodyMap.get(req) || req.rawBody;
I have rawBody because I use Firebase Cloud Functions: when it receives the body stream it stores the received buffer in the request's rawBody property (which is exactly what micro's json function tries to obtain).
full flow:
src/ApolloServer.ts (from the apollo-server-micro package)
import { graphqlMicro } from './microApollo';

const graphqlHandler = graphqlMicro(() => {
    return this.createGraphQLServerOptions(req, res);
});
const responseData = await graphqlHandler(req, res);
send(res, 200, responseData);
microApollo.ts - here the json function from 'micro' is used, passing req as a parameter
import { send, json, RequestHandler } from 'micro'; // https://github.com/vercel/micro
const graphqlHandler = async (req: MicroRequest, res: ServerResponse) => {
    let query;
    try {
        query =
            req.method === 'POST'
                ? req.filePayload || (await json(req))
                : url.parse(req.url, true).query;
    } catch (error) {
        // Do nothing; `query` stays `undefined`
    }
From the micro package (https://github.com/vercel/micro):
const getRawBody = require('raw-body');

exports.json = (req, opts) =>
    exports.text(req, opts).then(body => parseJSON(body));

exports.text = (req, {limit, encoding} = {}) =>
    exports.buffer(req, {limit, encoding}).then(body => body.toString(encoding));

exports.buffer = (req, {limit = '1mb', encoding} = {}) =>
    Promise.resolve().then(() => {
        const type = req.headers['content-type'] || 'text/plain';
        const length = req.headers['content-length'];
        // eslint-disable-next-line no-undefined
        if (encoding === undefined) {
            encoding = contentType.parse(type).parameters.charset;
        }
        // my try to hack the behavior
        const body = rawBodyMap.get(req) || req.rawBody;
        console.log(">>>>>>>>>>>>>>>>>>> ", body);
        if (body) {
            return body;
        }
        return getRawBody(req, {limit, length, encoding})
            .then(buf => {
                rawBodyMap.set(req, buf);
                return buf;
            })
            .catch(err => {
                if (err.type === 'entity.too.large') {
                    throw createError(413, `Body exceeded ${limit} limit`, err);
                } else {
                    throw createError(400, 'Invalid body', err);
                }
            });
    });
If my hack does not stop the code from trying to read the body stream, it calls getRawBody from the 'raw-body' package.
From the raw-body package:
function getRawBody (stream, options, callback) {
    // ……
    return new Promise(function executor (resolve, reject) {
        readStream(stream, encoding, length, limit, function onRead (err, buf) {
            if (err) return reject(err)
            resolve(buf)
        })
    })
}

function readStream (stream, encoding, length, limit, callback) {
    // ……
    // attach listeners
    // these callbacks are never called because the request body stream was already fully consumed earlier
    stream.on('aborted', onAborted)
    stream.on('close', cleanup)
    stream.on('data', onData)
    stream.on('end', onEnd)
    stream.on('error', onEnd)
    // ……
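As a standalone illustration of why those listeners never fire (a hypothetical sketch, not Apollo or micro code): once a readable stream has emitted 'data' and 'end', late listeners are never replayed, so a second consumer simply waits forever.
import { Readable } from "stream";

async function demo() {
    // Simulate a request body stream that an earlier middleware already consumed.
    const body = Readable.from(['{"query":"{ hello }"}']);

    let first = "";
    for await (const chunk of body) {
        first += chunk;
    }
    console.log("first read:", first);

    // A second consumer (like raw-body's readStream) now attaches listeners,
    // but 'data' and 'end' were already emitted and are not replayed.
    body.on("data", (chunk) => console.log("second read data:", chunk));
    body.on("end", () => console.log("second read end"));

    // Neither callback above runs; without its own timeout the second
    // consumer would wait forever, matching the hang described here.
    setTimeout(() => console.log("still waiting after 1 second"), 1000);
}

demo();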

Getting prefer-arrow-callback with Node https

I saw that the request library was deprecated, so I have been trying to switch to Node's https module instead. I pieced together this basic request function so far.
const https = require('https')

function httpRequest(options) {
    return new Promise((resolve, reject) => {
        const serverRequest = https.request(options, response => {
            let body = ''
            response.on('data', function (d) {
                body += d
            });
            response.on('end', function () {
                resolve(JSON.parse(body))
            })
        })
        serverRequest.on('error', err => {
            reject(err)
        })
        serverRequest.end()
    })
}
It works, but causes eslint to throw prefer-arrow-callback. I don't fully understand why https uses the .on syntax in the first place, so I'm wondering if this function can be re-written in a way that gets rid of the warning and is more in line with modern JavaScript.
I believe that warning means ESLint would prefer an arrow function (sometimes called a lambda). If you are new to arrow functions, they are formatted like this:
(parameters) => {
}
Try re-writing your code like this:
response.on('data', (d) => {
    body += d;
});
response.on('end', () => {
    resolve(JSON.parse(body));
});
As for the use of .on, it's just how Node attaches event listeners.
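Putting that together, the whole helper with arrow callbacks everywhere might look like this (same behaviour as the original function, only the callback style changes):
const https = require('https')

function httpRequest(options) {
    return new Promise((resolve, reject) => {
        const serverRequest = https.request(options, (response) => {
            let body = ''
            // Accumulate the response body chunk by chunk.
            response.on('data', (d) => {
                body += d
            })
            // Parse and resolve once the full body has arrived.
            response.on('end', () => {
                resolve(JSON.parse(body))
            })
        })
        // Connection-level errors reject the promise.
        serverRequest.on('error', (err) => {
            reject(err)
        })
        serverRequest.end()
    })
}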

How to write native Node.js async https request code

I have copied the very good code from https://www.tomas-dvorak.cz/posts/nodejs-request-without-dependencies/ to make an HTTP request in Node.js using native modules.
I want to be able to use the data value later on in the script.
I know this is a common issue for newcomers to async code; I just CANNOT understand it yet and have struggled for weeks to get it.
I have copied a lot of code, watched YouTube, talked to people; it's flippin' hard.
const getContent = function(url) {
    return new Promise((resolve, reject) => {
        const https = require('https')
        const request = https.get(url, (response) => {
            // handle http errors
            if (response.statusCode < 200 || response.statusCode > 299) {
                reject(new Error('Failed to load page, status code: ' + response.statusCode));
            }
            // temporary data holder
            const body = [];
            // on every content chunk, push it to the data array
            response.on('data', (chunk) => body.push(chunk));
            // we are done, resolve promise with those joined chunks
            response.on('end', () => resolve(body.join('')));
        });
        // handle connection errors of the request
        request.on('error', (err) => reject(err))
    })
}

getContent('https://myapi/json')
    .then((data) => console.log(data))
    .catch((err) => console.error(err))

// I want to use the "data" value down here in my script. I want to do things with the "data" value like JSON.parse(data)
console.log(data) // undefined
let json = JSON.parse(data) // undefined
console.log('after')
My result for data is undefined.
How can I use data down here, below all the code above?
You can set up a callback and access your data within that callback; this pattern should be easy enough to use.
getContent('https://myapi/json')
    .then(useData)
    .catch((err) => console.error(err))

// Use this callback to do what you want with your data!
function useData(data) {
    console.log(data);
    let json = JSON.parse(data);
}
Or using async/await (this might be more intuitive!):
async function testAwait() {
    let data = await getContent('https://myapi/json');
    console.log("data: ", data);
}
testAwait();
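If you also want the parsed JSON and error handling in one place, here is a small variant of the same async/await pattern (a sketch; the URL is just the placeholder from the question):
async function run() {
    try {
        const data = await getContent('https://myapi/json');
        // The data is only usable inside this async scope (or in code called
        // from here), never "below" the top-level getContent() call.
        const json = JSON.parse(data);
        console.log('parsed:', json);
    } catch (err) {
        // Covers both request failures and JSON parse errors.
        console.error(err);
    }
}
run();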

How can I set a timeout on Google Cloud Datastore .get?

I'm just starting out with some Google Cloud services, and I'm trying to get an entity from Datastore.
If the client has an internet connection, everything goes well.
But I want to put a try/catch statement in place for cases where the client has no access to Datastore for any reason (such as no internet).
Here's my code:
try {
    let search = datastore.key(['Client', Client_id])
    datastore.get(search, /*{timeout: 1000},*/ function (err, entity) {
        console.log('limit >>>', entity.limit)
        evt.emit('comparedate', res, entity.limit)
    });
}
catch (error) {
    console.log('Error >>>', error)
}
My problem is that there is no time limit on the connection attempt. When the client has no internet access the request stays "pending" forever and never reaches the catch block.
I tried some parameters like Global#CallOptions, but with no success.
Thanks for any help!
EDIT: I know this is not the most reliable way, but for now I resolved it with this code:
evt.on('isonline', (res) => {
    try {
        require('dns').lookup('google.com', function (err) {
            if (err && err.code == "ENOTFOUND") {
                console.log('NO INTERNET')
                evt.emit('readofflinedata', res)
            } else {
                console.log('WITH INTERNET')
                evt.emit('readonlinedata', res)
            }
        })
    }
    catch (error) {
        res.status(200).send({ error: true, message: error.message })
    }
})
The Datastore client internally uses a library called google-gax. You can configure timeouts and other call options by passing gaxOptions.
datastore.get(key, {
    gaxOptions: {timeout: 1000}
}, (err, entity) => {
    // ...
});
I didn't find any parameter for adding a timeout to Datastore's get function. However, you can use a Promise and set a timer: if the function takes too long, the promise is rejected with a timeout.
var Promise = require("bluebird");

var elt = new Promise((resolve, reject) => {
    fun(param, (err) => {          // placeholder async call from this answer
        if (err) reject(err);
        doSomething();             // <- datastore.get() function
        resolve();
    });
});

elt.timeout(1000)
    .then(() => console.log('done'))
    .catch(Promise.TimeoutError, (e) => console.log("timed out"))
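For completeness, the same idea with plain promises instead of Bluebird, racing the Datastore call against a timer (a sketch; the one-second limit and the promise-returning form of datastore.get() are assumptions, and the key/event names are taken from the question):
function withTimeout(promise, ms) {
    // Reject if the wrapped promise does not settle within `ms` milliseconds.
    const timer = new Promise((resolve, reject) => {
        setTimeout(() => reject(new Error('Datastore request timed out')), ms);
    });
    return Promise.race([promise, timer]);
}

const search = datastore.key(['Client', Client_id]);

withTimeout(datastore.get(search), 1000)
    .then(([entity]) => {
        console.log('limit >>>', entity.limit);
        evt.emit('comparedate', res, entity.limit);
    })
    .catch((error) => {
        // Fires on timeout as well as on Datastore errors.
        console.log('Error >>>', error);
    });
Note that the race only stops the caller from waiting; the underlying Datastore request is not cancelled.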

Axios.all: how to configure the axios wait time to mitigate socket hang up?

My application uses an internal web service for fetching data. I have a job which creates approximately 500 requests that get fired asynchronously to complete the fetch operation.
I use Axios, creating an array of axios promises and then resolving them using Axios.all().
It works fine up to about 200 requests, but beyond that I get "socket hang up", even though on the server side I can see the requests are being processed.
How do I configure axios to set a custom timeout, or is it a better idea to splice my promise array and run the requests in multiple batches?
Source code
let getAxiosPromiseArray = (urlList) => {
    var axiosArrayofPromise = [];
    return new Promise((resolve, reject) => {
        try {
            urlList.forEach((URL) => {
                axiosArrayofPromise.push(axios.get(URL));
            });
            resolve(axiosArrayofPromise);
        }
        catch (err) {
            reject("There is a problem getting Axios array of promises " + err);
        }
    })
}

async function processAxiosPromises(PromiseArray) {
    try {
        var results = []
        results = await axios.all(PromiseArray);
        return results;
    }
    catch (err) {
        throw("There was a problem resolving promises array (Axios) " + err);
    }
}

getallID().then((urlList) => {
    return getAxiosPromiseArray(urlList);
}).then((AxiosPromises) => {
    return processAxiosPromises(AxiosPromises);
}).then((resultData) => {
    console.log(resultData);
});
Error
There was a problem resolving promises array (Axios) Error: socket hang up
First, that pair of functions getAxiosPromiseArray() and processAxiosPromises() needs fixing.
Your new Promise() construction is unnecessary. You can simply return Promise.all(arrayofPromise) (or axios.all(...) if you must) and do away with the other function.
Renaming the remaining function to something meaningful, you would end up with, e.g.:
let getData = (urlList) => {
    return Promise.all(urlList.map(URL => axios.get(URL)))
        .catch(error => {
            error.message = "There is a problem getting Axios array of promises " + error.message; // augment the error message ...
            throw error; // ... and re-throw the error.
        });
};
And call it as follows:
getallID().then(getData)
    .then(resultData => {
        console.log(resultData);
    }).catch(error => {
        console.error(error);
    });
That will put you on solid ground but, on its own, is unlikely to fix a concurrency problem (if that's what it is), for which the simplest approach is to use Bluebird's Promise.map with the concurrency option.
The caller code can remain the same, just change getData(), as follows:
let getData = (urlList) => {
    let concurrency = 10; // play with this value to find a reliable concurrency limit
    return Promise.map(urlList, URL => axios.get(URL), {'concurrency': concurrency})
        .catch(error => {
            error.message = "There is a problem getting Axios array of promises " + error.message;
            throw error;
        });
};
// where `Promise` is Bluebird.
const axios = require('axios');
const axiosThrottle = require('axios-throttle');

// pass the axios object and the delay between requests in ms
axiosThrottle.init(axios, 200)

const options = {
    method: 'GET',
};
const urlList = [
    'https://jsonplaceholder.typicode.com/todos/1',
    'https://jsonplaceholder.typicode.com/todos/2',
    'https://jsonplaceholder.typicode.com/todos/3',
    'https://jsonplaceholder.typicode.com/todos/4',
    'https://jsonplaceholder.typicode.com/todos/5',
    'https://jsonplaceholder.typicode.com/todos/6',
    'https://jsonplaceholder.typicode.com/todos/7',
    'https://jsonplaceholder.typicode.com/todos/8',
    'https://jsonplaceholder.typicode.com/todos/9',
    'https://jsonplaceholder.typicode.com/todos/10'
];
const promises = [];

const responseInterceptor = response => {
    console.log(response.data);
    return response;
};

// add an interceptor to work with each response separately when it is resolved
axios.interceptors.response.use(responseInterceptor, error => {
    return Promise.reject(error);
});

for (let index = 0; index < urlList.length; index++) {
    options.url = urlList[index];
    promises.push(axiosThrottle.getRequestPromise(options, index));
}

// run when all promises are resolved
axios.all(promises).then(responses => {
    console.log(responses.length);
});
https://github.com/arekgotfryd/axios-throttle
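On the timeout part of the question: axios itself accepts a per-request timeout (in milliseconds) in its request config. Here is a rough sketch combining that with the batching idea from the question (the chunk size, timeout value, and helper name are assumptions, and getallID() is the helper from the question):
const axios = require('axios');

// Fire the URL list in sequential chunks so only `batchSize` requests
// are in flight at once, and give each request its own timeout.
async function fetchInBatches(urlList, batchSize = 50, timeoutMs = 10000) {
    const results = [];
    for (let i = 0; i < urlList.length; i += batchSize) {
        const batch = urlList.slice(i, i + batchSize);
        const responses = await Promise.all(
            batch.map(URL => axios.get(URL, { timeout: timeoutMs }))
        );
        results.push(...responses);
    }
    return results;
}

// Usage:
// getallID()
//     .then(urlList => fetchInBatches(urlList))
//     .then(responses => console.log(responses.length))
//     .catch(err => console.error(err));
Note that a single failed request rejects its whole batch; Promise.allSettled could be used instead if partial results are acceptable.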
