I'm using NestJS, and this is my current implementation for making parallel HTTP requests:
@Injectable()
export class AppService {
  constructor(private readonly http: HttpService) {}

  private fetchData(index: number) {
    Logger.log(`Collect index: ${index}`);
    return this.http
      .get<Passenger>(
        `https://api.instantwebtools.net/v1/passenger?page=${index}&size=100`,
        { validateStatus: null },
      )
      .pipe(concatMap(response => of(response.data)));
  }

  async getAllData() {
    let a = 0;
    const collect: Passenger[] = [];
    const $subject = new BehaviorSubject(a);
    await $subject
      .pipe(
        flatMap(index =>
          forkJoin([
            this.fetchData(index),
            this.fetchData(index + 1),
            this.fetchData(index + 2),
          ]).pipe(mergeAll()),
        ),
        tap(data => {
          collect.push(data);
          if (data?.data?.length === 0) {
            $subject.complete(); // stop the stream
          } else {
            a += 3; // increment by 3, because 3 requests are made at a time
            $subject.next(a);
          }
        }),
      )
      .toPromise();
    return collect;
  }
}
This service collects data from a 3rd-party API. As it stands, fetchData() is called once per parallel request I want to make at a time. I use a dummy API for testing, but in the real scenario the API endpoint's page size limit is 100 and it doesn't return any meta information about the total number of pages; it just returns empty data when the last page is reached.
The goal is to make parallel requests and combine the results at the end. I'm doing this to keep the total request time to a minimum, and because the API itself has a rate limit of 50 requests per second. How can I optimize this code?
To fetch all pages in one go, you can use expand to recursively subscribe to an observable that fetches a batch of pages. End the recursion by returning EMPTY when the last page received is empty.
import { EMPTY, forkJoin, Observable } from 'rxjs';
import { expand, reduce } from 'rxjs/operators';

function fetchAllPages(batchSize: number = 3): Observable<any[][]> {
  let index = 0;
  return fetchPages(index, batchSize).pipe(
    // if the last page isn't empty, fetch the next batch; otherwise end the recursion
    expand(pages => pages[pages.length - 1].length > 0
      ? fetchPages((index += batchSize), batchSize)
      : EMPTY
    ),
    // accumulate all pages in one array, filtering out any trailing empty pages
    reduce((acc, curr) => acc.concat(curr.filter(page => page.length)), [])
  );
}

// fetch a given number of pages starting from 'index' as parallel requests
function fetchPages(index: number, numberOfPages: number): Observable<any[][]> {
  const requests = Array.from({ length: numberOfPages }, (_, i) =>
    fetchData(index + i)
  );
  return forkJoin(requests);
}
https://stackblitz.com/edit/rxjs-vkad5h?file=index.ts
This will obviously send a few unnecessary requests in the last batch if (totalNumberOfPages + 1) % batchSize != 0 (you need totalNumberOfPages + 1 requests in total, since one extra empty page is required to detect the end; anything beyond that in the final batch is wasted).
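As a usage note (my sketch, not part of the answer above): the stream can be awaited just like in the question's getAllData(), e.g. with firstValueFrom from RxJS 7+ (on RxJS 6 you'd use .toPromise() instead):

import { firstValueFrom } from 'rxjs';

// sketch: collect every page into one flat array and await the result
async function getAllData(): Promise<any[]> {
  const pages = await firstValueFrom(fetchAllPages(3));
  return pages.flat(); // assumes you want the combined rows, not the pages
}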
I have an array of users where each user has an IP address.
I have an API to which I send an IP in a request, and it returns the country code that belongs to that IP.
In order to get a country code for each user, I need to send a separate request per user.
In my code I use async/await, but it takes about 10 seconds until I get all the responses; if I don't await, I don't get the country codes at all.
My code:
async function getAllusers() {
  let allUsersData = await usersDao.getAllusers();
  for (let i = 0; i < allUsersData.length; i++) {
    let data = { ip: allUsersData[i].ip };
    let body = new URLSearchParams(data);
    await axios
      .post("http://myAPI", body)
      .then((res) => {
        allUsersData[i].countryCode = res.data.countryCode;
      });
  }
  return allUsersData;
}
You can use Promise.all to make all your requests at once instead of making them one by one.
let requests = [];
for (let i = 0; i < allUsersData.length; i++) {
  let data = { ip: allUsersData[i].ip };
  let body = new URLSearchParams(data);
  requests.push(axios.post("http://myAPI", body)); // axios.post returns a Promise
}

try {
  const results = await Promise.all(requests);
  // results now contains each request's result, in the same order
  // Your logic here...
} catch (e) {
  // Handle errors
}
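Since Promise.all resolves with the results in the same order as the requests array, mapping the country codes back onto the users is just an index walk; a minimal sketch of the "Your logic here..." part:

const results = await Promise.all(requests);
// requests were pushed in user order, so index i matches allUsersData[i]
results.forEach((res, i) => {
  allUsersData[i].countryCode = res.data.countryCode;
});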
If you're just trying to get all the results faster, you can request them in parallel and know when they are all done with Promise.all():
async function getAllusers() {
  let allUsersData = await usersDao.getAllusers();
  await Promise.all(allUsersData.map((userData, index) => {
    let body = new URLSearchParams({ ip: userData.ip });
    return axios.post("http://myAPI", body).then((res) => {
      allUsersData[index].countryCode = res.data.countryCode;
    });
  }));
  return allUsersData;
}
Note, I would not recommend doing it this way if the allUsersData array is large (more than 20 elements or so), because you'll be raining a lot of simultaneous requests on the target server, which may either impede its performance or get you rate limited or even refused service. In that case, you'd want to send N requests at a time (perhaps 5) using a concurrency-limiting helper such as pMap() or mapConcurrent().
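A minimal sketch of such a concurrency limiter, as an illustrative stand-in for pMap()/mapConcurrent() rather than their actual implementations:

// run an async mapper over items with at most `limit` requests in flight
async function mapConcurrent<T, R>(
  items: T[],
  limit: number,
  fn: (item: T, index: number) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // each worker repeatedly claims the next unprocessed index
  async function worker() {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  }
  // start `limit` workers and wait for them all to drain the list
  await Promise.all(Array.from({ length: limit }, worker));
  return results;
}

With that, the example above becomes await mapConcurrent(allUsersData, 5, userData => axios.post("http://myAPI", new URLSearchParams({ ip: userData.ip }))).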
How do I do pagination in a Node.js API? I have the code here with the Post.find().populate() calls, and I'm not sure at which point in the code I can do pagination using limit and offset. And by the way, if I want to do an infinite scroll, should I use limit/offset or limit/skip? Any help greatly appreciated and many thanks in advance.
router.get('/allpost', JWTAuthenticatToken, async (req, res) => {
  try {
    const post = await Post.find()
      .populate("postedby", "_id displayname profileimage")
      .populate("reviewers.postedby", "_id displayname")
    res.json(post)
  } catch (error) {
    res.json({ message: error })
  }
})
For pagination you need the values of two things:
1. current_page_no
2. item_per_page
Then you can write the code like this:
router.get('/allpost', JWTAuthenticatToken, async (req, res) => {
  try {
    const post = await Post.find()
      .populate("postedby", "_id displayname profileimage")
      .populate("reviewers.postedby", "_id displayname")
      .skip((item_per_page * current_page_no) - item_per_page)
      .limit(item_per_page)
    res.json(post)
  } catch (error) {
    res.json({ message: error })
  }
})
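For example, with item_per_page = 10 and current_page_no = 3, the query skips (10 × 3) − 10 = 20 documents and returns documents 21–30.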
I can share how my implementation works using Express.js and a MySQL database. The ideas behind it are:
1. Determine the numberOfRows of data that you wish to display from the database.
2. Set a pageSize for the number of items to be displayed on a single page.
3. Always return the page number for the current view via req.query; by default it is page 1.
4. From the returned page, calculate the number of rows of data to skip.
5. Lastly, set up the LIMIT offset, length clause in your query.
The coding part:
var connection;
var pageSize = 20;
var numberOfRows, numberOfPages;
var numberPerPage = parseInt(pageSize, 10) || 1;
var page = parseInt(req.query.page, 10) || 1; // current page from the query string
var skip = (page - 1) * numberPerPage;
var limit = `${skip} , ${numberPerPage}`; // offset, length

return connectionPool.getConnection()
  .then((connect) => {
    connection = connect;
    return connection.query(`SELECT COUNT(*) AS total ....`); // to get total rows
  })
  .then(([rows, field]) => {
    numberOfRows = rows[0].total;
    numberOfPages = Math.ceil(numberOfRows / numberPerPage);
    return connection.query(`SELECT .... LIMIT ${limit}`); // get the page's data with LIMIT
  })
  .then(([rows, field]) => {
    const result = {
      rows: rows,
      pagination: {
        current: page,
        numberPerPage: numberPerPage,
        has_previous: page > 1,
        previous: page - 1,
        has_next: page < numberOfPages,
        next: page + 1,
        last_page: Math.ceil(numberOfRows / pageSize)
      }
    }
    return result;
  })
  .catch((err) => {
    throw new Error(err.message);
  })
  .finally(() => {
    connection.release();
  })
It's workable code; all you need to do is organize the returned result properly in your pagination at the frontend. Although it's done with MySQL, I hope you get the concept behind it.
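As a worked example: with pageSize = 20 and page = 2, skip is (2 − 1) × 20 = 20, the query ends with LIMIT 20 , 20, and rows 21–40 are returned.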
For pagination you will always need to send the current page. You can send it as a query string, a route param, or just a POST variable.
router.get('/allpost', JWTAuthenticatToken, async (req, res) => {
  try {
    let page = req.body.page ?? 1;
    let limit = 10;
    const post = await Post.find()
      .populate("postedby", "_id displayname profileimage")
      .populate("reviewers.postedby", "_id displayname")
      .skip((page * limit) - limit)
      .limit(limit)
    res.json(post)
  } catch (error) {
    res.json({ message: error })
  }
})
Here I send the page with a POST body.
What you do is set a limit for the maximum number of items you want to display per page. Then you need some basic calculations: on page 1 you want to skip 0, on page 2 you want to skip 10, on page 3 you want to skip 20, and so on.
page * limit alone is off by one page: on page 1, 10 * 1 = 10, but you want to skip 0. That's why you subtract the limit, giving (page * limit) - limit.
I am new to Node.js. I want to limit my external API calls to 5 per minute. If I exceed 5 API calls per minute, I get the following error:
You have exceeded the maximum requests per minute.
This is my code. Here the tickerSymbol array that is passed to the scheduleTickerDetails function will be a large array with almost 100k elements in it.
public async scheduleTickerDetails(tickerSymbol: any) {
  for (let i = 0; i < tickerSymbol.length; i++) {
    if (i % 5 == 0) {
      await this.setTimeOutForSymbolAPICall(60000);
    }
    await axios.get('https://api.polygon.io/v1/meta/symbols/' + tickerSymbol[i] + '/company?apiKey=' + process.env.POLYGON_API_KEY)
      .then(async function (response: any) {
        console.log("Logo : " + response.data.logo + 'TICKER :' + tickerSymbol[i]);
        let logo = response.data.logo;
        if (await stockTickers.updateOne({ ticker: tickerSymbol[i] }, { $set: { "logo": logo } }))
          return true;
        else
          return false;
      })
      .catch(function (error: any) {
        console.log("Error from symbol service file : " + error + 'symbol:' + tickerSymbol[i]);
      });
  }
}
/**
 * Set a timeout before the next batch of symbol API calls
 * @param ms the delay in milliseconds
 * @return Promise
 */
public setTimeOutForSymbolAPICall(ms: any) {
  return new Promise(resolve => {
    setTimeout(() => { resolve('') }, ms);
  });
}
I want to send the first 5 API calls, then after a minute the next 5, and so on. I created this setTimeout helper for that, but sometimes I still see this in my console:
Error: Request failed with status code 429 : You've exceeded the maximum requests per minute.
A for loop in JS runs immediately to completion while all your asynchronous operations are started. Refer to this answer.
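One possible fix (my sketch, not from the answer above): process the symbols in chunks of 5 and wait a full minute between chunks, so a chunk can never overrun the rate window. fetchSymbol is a hypothetical stand-in for the axios call in the question:

const delay = (ms: number) => new Promise(resolve => setTimeout(resolve, ms));

async function scheduleInBatches(symbols: string[], batchSize = 5) {
  for (let i = 0; i < symbols.length; i += batchSize) {
    // fire this chunk's requests in parallel and wait for all of them
    await Promise.all(symbols.slice(i, i + batchSize).map(s => fetchSymbol(s)));
    // wait out the minute before starting the next chunk
    if (i + batchSize < symbols.length) await delay(60000);
  }
}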
I'm using Ionic 5.4.5. The REST API was created with Node and Express.js.
When I search, the app only looks for data on the first page; I want it to search all the data. How can I do this?
Our API is paginated.
app.get("/units", (req, res) => {
let page = parseInt(req.query.page)
let limit = parseInt(req.query.limit)
if(!page && !limit){
page = 1;
limit = 5;
}
let startIndex = (page -1) * limit
let endIndex = page * limit
let results = {}
if(endIndex < data.length){
results.next = {
page : page + 1,
limit : limit
}
}
if(startIndex > 0){
results.previous = {
page : page - 1,
limit : limit
}
}
results.results = data.slice(startIndex,endIndex)
res.json(results);
});
app.get("/units/:id", (req, res) => {
const itemId = req.params.id;
const item = data.find(_item => _item.id == itemId);
if (item) {
res.json(item);
} else {
res.json({ message: `item ${itemId} doesn't exist`})
}
});
home.page.ts
getItems(ev) {
  let val = ev.target.value;
  if (val == '') {
    this.loadUnit();
    return;
  }
  if (val && val.trim() != '') {
    this.unit = this.unit.filter((un, name) => {
      return (un.name.toLocaleLowerCase().indexOf(val.toLocaleLowerCase()) > -1);
    })
  }
}
I also use a service to get data from the API.
Either return all data from the API and perform filtering and pagination in the app, or perform filtering and pagination (in that order) in the API.
The last option is probably better if there are a lot of data rows, because:
1. Sending all rows to the app will use a lot of data and take a lot of time, especially over a WiFi or mobile data connection.
2. Processing on the server is usually faster than processing on the client, especially on a mobile device.
You might also want to return the total number of rows from the API, as in the sketch below.
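A minimal sketch of the API-side approach (filter first, then paginate), reusing data, page, and limit from the question's /units route; the q parameter and the name field are assumptions for illustration:

app.get("/units/search", (req, res) => {
  const q = String(req.query.q || '').toLocaleLowerCase();
  const page = parseInt(req.query.page) || 1;
  const limit = parseInt(req.query.limit) || 5;

  // 1. filter over the whole data set, not just the current page
  const filtered = data.filter(un =>
    un.name.toLocaleLowerCase().indexOf(q) > -1
  );

  // 2. then paginate the filtered rows
  const startIndex = (page - 1) * limit;
  res.json({
    total: filtered.length, // total matching rows, useful for the client
    results: filtered.slice(startIndex, startIndex + limit)
  });
});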
I'm fairly new to Node.js and I am using Puppeteer to automate some browsing, but I am getting a bit lost with the complexity of a certain scenario.
I am clicking a button, and it will search some records (using AJAX) and put the result on the page.
waitForResponse / waitForRequest don't really fit, because I am waiting for 2-3 requests depending on the type of search, and the response URLs are exactly the same for each. So, I guess I want to wait for 3 responses of this particular URL to be complete.
Maybe I need to rethink this, or maybe I am close? The promise always times out even though responseCount seems to be increasing.
async function intercepted(resp) {
  if (resp.url().includes('/ajaxpro/')) {
    return 1
  }
  return 0
}

let responseCount = 0
page.on('response', async resp => {
  responseCount += await intercepted(resp)
})

const getResponse = await new Promise((resolve, reject) => {
  setTimeout(() => resolve(responseCount > 3), 60000)
})
Try checking the condition after each response is received.
async function intercepted(resp) {
  if (resp.url().includes('/ajaxpro/')) {
    return 1
  }
  return 0
}

let responseCount = 0
page.on('response', async resp => {
  let isTargetSearch = await intercepted(resp);
  responseCount += isTargetSearch;
  // the current response is one we are looking for
  // and it has been seen 3 times
  if (isTargetSearch && responseCount == 3) {
    // Do what you need to do here
  }
})
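To keep the question's await-style flow, that check can be wrapped in a promise that resolves on the third matching response, with a timeout as a safety net. A sketch building on the answer above ('#searchButton' is a hypothetical selector):

// resolve once `count` responses matching the AJAX URL have been seen
function waitForResponses(page, count, timeoutMs = 60000) {
  return new Promise((resolve, reject) => {
    let seen = 0;
    const handler = resp => {
      if (resp.url().includes('/ajaxpro/') && ++seen === count) {
        page.off('response', handler);
        clearTimeout(timer);
        resolve();
      }
    };
    const timer = setTimeout(() => {
      page.off('response', handler);
      reject(new Error('Timed out waiting for AJAX responses'));
    }, timeoutMs);
    page.on('response', handler);
  });
}

// usage: start listening before the click so no response is missed
const done = waitForResponses(page, 3);
await page.click('#searchButton');
await done;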
Maybe you don't have to wait until a certain number of responses have been intercepted, but instead wait until the results from those AJAX calls have rendered something on the page (wait for visual results). In that case, you would use page.waitForSelector(selector), where the selector matches something the results of the calls render on screen.
When dealing with Puppeteer, I usually find waiting for visible results to be better.
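For example (a sketch; '.search-result' and '#searchButton' are hypothetical selectors for the rows the AJAX calls render and the button that triggers the search):

// click, then wait for rendered results instead of counting responses
await page.click('#searchButton');
await page.waitForSelector('.search-result', { visible: true, timeout: 60000 });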