How do I do pagination in a Node.js API? I have the code below that uses Post.find() with populate(), and I am not sure at which point in the code I can add pagination using limit and offset. Also, if I want to build an infinite scroll, should I use limit/offset or limit/skip? Any help is greatly appreciated, and many thanks in advance.
router.get('/allpost', JWTAuthenticatToken, async (req, res) => {
    try {
        const post = await Post.find()
            .populate("postedby", "_id displayname profileimage")
            .populate("reviewers.postedby", "_id displayname")
        res.json(post)
    } catch (error) {
        res.json({ message: error })
    }
})
For pagination you need the values of two things:
1. current_page_no
2. item_per_page
Then you can write the code like this:
router.get('/allpost', JWTAuthenticatToken, async (req, res) => {
    try {
        const post = await Post.find()
            .populate("postedby", "_id displayname profileimage")
            .populate("reviewers.postedby", "_id displayname")
            .skip((item_per_page * current_page_no) - item_per_page)
            .limit(item_per_page)
        res.json(post)
    } catch (error) {
        res.json({ message: error })
    }
})
I can share how my implementation works using Express.js and a MySQL database. The ideas behind it are:
Determine the numberOfRows of the data that you wish to display from the database.
Set a pageSize for the number of items to be displayed on a single page.
Always get the page number for the current view from req.query; otherwise it defaults to page 1.
Using that page, calculate the number of rows of data to skip.
Lastly, set up the LIMIT offset, length clause in your query.
The code looks like this:
var connection;
var result;
var pageSize = 20;
var numberOfRows, numberOfPages;
var numberPerPage = parseInt(pageSize, 10) || 1;
var page = parseInt(pageInfo.page, 10) || 1;   // current page number from the request
var skip = (page - 1) * numberPerPage;         // rows to skip before the current page
var limit = `${skip} , ${numberPerPage}`;      // "offset, length" for the LIMIT clause

return connectionPool.getConnection()
    .then((connect) => {
        connection = connect;
        return connection.query(`SELECT COUNT(*) AS total ....`); // To get total rows
    })
    .then(([rows, field]) => {
        numberOfRows = rows[0].total;
        numberOfPages = Math.ceil(numberOfRows / numberPerPage);
        return connection.query(`SELECT .... LIMIT ${limit}`); // Get the page of data with LIMIT
    })
    .then(([rows, field]) => {
        result = {
            rows: rows,
            pagination: {
                current: page,
                numberPerPage: numberPerPage,
                has_previous: page > 1,
                previous: page - 1,
                has_next: page < numberOfPages,
                next: page + 1,
                last_page: Math.ceil(numberOfRows / pageSize)
            }
        };
        return result;
    })
    .catch((err) => {
        throw new Error(err.message);
    })
    .finally(() => {
        connection.release();
    });
This is working code; all you need to do is organize the returned result properly in your pagination on the frontend. Although it is done with MySQL, I hope you get the concept behind it.
For pagination you will always need to send the current page. You can send it as a query string, a route param, or just a POST variable.
router.get('/allpost', JWTAuthenticatToken, async (req, res) => {
    try {
        let page = req.body.page ?? 1;
        let limit = 10;
        const post = await Post.find()
            .populate("postedby", "_id displayname profileimage")
            .populate("reviewers.postedby", "_id displayname")
            .skip((page * limit) - limit)
            .limit(limit)
        res.json(post)
    } catch (error) {
        res.json({ message: error })
    }
})
I send it here as a POST variable.
What you do here is set a limit for how many items you want to display per page at most. Then you need some basic calculations.
On page 1 you want to skip 0, on page 2 you want to skip 10, on page 3 you want to skip 20, and so on...
With a limit of 10, page * limit is 10 on page 1, which would skip 10 documents even though you want to skip 0. That is why you subtract the limit: (page * limit) - limit gives 0 on page 1, 10 on page 2, 20 on page 3.
Related
I am working on a project that deals with a large amount of data. Fetching all the data from MongoDB at once is not an option, since it results in a bad user experience. I am building an infinite-loading setup: with each scroll I want to fetch a fixed number of documents from MongoDB and concatenate the newly fetched data with the previously fetched data to show the results on my webpage.
How do I do pagination in MongoDB using Node.js?
The MongoDB Node.js driver lets you paginate through the limit and skip options.
// You can first start by counting the number of entities in your collection
collection.countDocuments()
    .then((count) => {
        var step = 1000;
        var offset = 0;
        var limit = step;
        // then, exploiting your offset and limit variables,
        // you can limit the number of results you get in each query (a page of results)
        while (offset < count) {
            process(offset, limit);
            offset += step;
        }
    })
    .catch((error) => console.error(error));

async function process(offset, limit) {
    var entities = collection.find(
        {},
        {
            limit: limit,
            skip: offset,
        }
    );
    for await (const entity of entities) {
        // do what you want with each entity
    }
}
You can find more details on the MongoDB documentation page.
https://www.mongodb.com/docs/drivers/node/current/fundamentals/crud/read-operations/limit/
So I have a collection that will eventually hold a lot of documents, but for now let's suppose it has about 100 (it had fewer than that when I was testing).
I need to get all the documents of the collection, put them in an array, sort it, and then send it over the websocket to the frontend. But the array ends up with the wrong data.
This is my code:
const emitSales = async (socket) => {
let salesArray = [];
const saleExists = (contract) => {
return salesArray.some(element => element.contract === contract);
}
const addSale = (contract) => {
const element = salesArray.find(e => e.contract === contract);
element.sales = element.sales+1;
}
const sales = await fiveMinSchema.find({}).lean();
if(sales) {
for await (x of sales) {
if(saleExists(x.contract)) {
addSale(x.contract);
continue;
}
const collection = await Collection.findOne({
contract: x.contract
});
let newsale = {
contract: x.contract,
title: collection.title,
description: collection.description,
image: collection.image,
sales: 1,
}
salesArray.push(newsale);
}
socket.emit("5min", salesArray.sort((a,b) => {
return b.sales-a.sales;
}).slice(0,10));
}
}
When I execute this function only once, the array returns the correct values. But if I execute the function, say, 2 times in a row (very quickly), it starts returning the array with wrong data (the data gets mixed). And since I am using websockets, this function will execute every 2 seconds or so. How can I fix this problem? It seems to be executing more than once simultaneously and mixing the data.
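One thing worth noting about the code above (my own observation, not something stated in the original post): in for await (x of sales) the variable x is never declared, so it becomes an implicit global shared by every overlapping invocation of emitSales. When two runs interleave, one run can reassign x between the await on Collection.findOne and the moment the other run builds newsale, which produces exactly this kind of mixed data. A minimal sketch of a fix, assuming you also want to prevent overlapping runs:
// Sketch only: declare the loop variable locally and skip a run if the previous one is still in flight.
// The inFlight flag is a hypothetical addition, not part of the original code.
let inFlight = false;

const emitSales = async (socket) => {
    if (inFlight) return;          // previous run still working, skip this tick
    inFlight = true;
    try {
        const salesArray = [];
        const sales = await fiveMinSchema.find({}).lean();
        for (const x of sales ?? []) {   // "const x" keeps x local to this invocation
            // ... same aggregation logic as in the original function ...
        }
        socket.emit("5min", salesArray.sort((a, b) => b.sales - a.sales).slice(0, 10));
    } finally {
        inFlight = false;
    }
};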
I have a large number of URLs in an xlsx file. What I'd like to do is randomly select some of these URLs, load them, and then check that they return a status code of 200.
So I'm using the npm alasql package to do this.
At the moment, the following code successfully loads the first 5 URLs in the spreadsheet, checks that they return 200, then finishes the test.
var alasql = require('alasql');
var axios = require('axios');
module.exports = {
'#tags': ['smokeTest'],
'Site map XML pages load': async (browser) => {
const result = await alasql.promise('select URL from xlsx("./testUrls.xlsx",{sheetid:"RandomUrls"})');
var xcelData = result.map(item => {
return item.URL;
});
async function siteMapUrlsTestArr(item) {
var response = await axios.get(browser.launch_url + item);
browser.verify.equal(response.status, 200);
console.log('Sitemap test URL =', (browser.launch_url + item));
}
for (let index = 0; index < xcelData.length; index++) {
if (index < 5) {
const xmlTestUrl = xcelData[index];
await siteMapUrlsTestArr(xmlTestUrl);
} else {}
}
},
'Closing the browser': function (browser) {
browser.browserEnd();
},
};
However, what I'd like to do is randomly select 5 URLs from the (large) list of URLs in the spreadsheet, rather than taking the first 5.
I appreciate that this will (probably) involve Math.floor(Math.random() * ...), but I can't seem to get it to work no matter where I place it.
Any help would be greatly appreciated. Thanks.
Your logic is flawed. Here's how.
You want to select 5 random URLs from the list and then perform the operation on those items, but what you're actually doing is getting all the items and running the operation on the first five in a loop.
To correct it:
//Fixed to five as you only want to test 5 URLs.
for (let index = 0; index < 5; index++) {
//Selecting a Random item from the list using Math.random();
const xmlTestUrl = xcelData[Math.floor(Math.random() * xcelData.length)];
//Performing the HTTP response operation on it.
await siteMapUrlsTestArr(xmlTestUrl);
}
The loop above selects a random item on each iteration and performs the HTTP response check on it; the items are chosen randomly using Math.random().
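One caveat (my addition, not part of the answer above): picking a random index on every iteration can test the same URL more than once. If you want 5 distinct URLs, a simple alternative is to shuffle a copy of the array and take the first five:
// Sketch: sample 5 distinct URLs by shuffling a copy of the list first.
// The random-comparator shuffle is quick and dirty (not perfectly uniform), but fine for a smoke test.
const shuffled = [...xcelData].sort(() => Math.random() - 0.5);
for (const url of shuffled.slice(0, 5)) {
    await siteMapUrlsTestArr(url);
}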
I need some advice on how to structure this function, as at the moment things are not happening in the correct order due to Node being asynchronous.
This is the flow I want to achieve; I don't need help with the code itself, but with the order in which to do things to achieve the end result, plus any suggestions on how to make it efficient.
1. Node routes a GET request to my controller.
2. The controller reads a .csv file on the local system and opens a read stream using the fs module.
3. It then uses the csv-parse module to convert that to an array line by line (many hundreds of thousands of lines).
4. Start a try/catch block.
5. With the current row from the CSV, take a value and try to find it in MongoDB.
6. If found, take the ID and store the line from the CSV, with this ID as a foreign ID, in a separate database.
7. If not found, create an entry in the DB, take the new ID, and then do step 6.
8. Print to the terminal the row number being worked on (ideally, at some point I would like to be able to send this value to the page and have it update like a progress bar as the rows are completed).
Here is a small part of the code structure that I am currently using:
const fs = require('fs');
const parse = require('csv-parse');
function addDataOne(req, id) {
const modelOneInstance = new InstanceOne({ ...code });
const resultOne = modelOneInstance.save();
return resultOne;
}
function addDataTwo(req, id) {
const modelTwoInstance = new InstanceTwo({ ...code });
const resultTwo = modelTwoInstance.save();
return resultTwo;
}
exports.add_data = (req, res) => {
const fileSys = 'public/data/';
const parsedData = [];
let i = 0;
fs.createReadStream(`${fileSys}${req.query.file}`)
.pipe(parse({}))
.on('data', (dataRow) => {
let RowObj = {
one: dataRow[0],
two: dataRow[1],
three: dataRow[2],
etc,
etc
};
try {
ModelOne.find(
{ propertyone: RowObj.one, propertytwo: RowObj.two },
'_id, foreign_id'
).exec((err, searchProp) => {
if (err) {
console.log(err);
} else {
if (searchProp.length > 1) {
console.log('too many returned from find function');
}
if (searchProp.length === 1) {
addDataOne(RowObj, searchProp[0]).then((result) => {
searchProp[0].foreign_id.push(result._id);
searchProp[0].save();
});
}
if (searchProp.length === 0) {
let resultAddProp = null;
addDataTwo(RowObj).then((result) => {
resultAddProp = result;
addDataOne(req, resultAddProp._id).then((result) => {
resultAddProp.foreign_id.push(result._id);
resultAddProp.save();
});
});
}
}
});
} catch (error) {
console.log(error);
}
i++;
let iString = i.toString();
process.stdout.clearLine();
process.stdout.cursorTo(0);
process.stdout.write(iString);
})
.on('end', () => {
res.send('added');
});
};
I have tried to make the functions use async/await, but it seems to conflict with fs.createReadStream or the csv-parse functionality, probably due to my inexperience and incorrect use of the code...
I appreciate that this is a long question about the fundamentals of the code, but some tips/advice/pointers on how to get this going would be appreciated. I had it working when the data was sent one record at a time via a POST request from Postman, but I can't implement the next stage, which is to read from the CSV file containing many records.
First of all, you can combine the following two checks into one query:
if (searchProp.length === 1) {
if (searchProp.length === 0) {
Use the upsert option in MongoDB's findOneAndUpdate query to update or insert in a single step.
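As a rough sketch of that idea with Mongoose (the filter fields come from the question's code; the update payload is only illustrative):
// Sketch: find-or-create in a single round trip using upsert.
const doc = await ModelOne.findOneAndUpdate(
    { propertyone: RowObj.one, propertytwo: RowObj.two },                     // match criteria from the question
    { $setOnInsert: { propertyone: RowObj.one, propertytwo: RowObj.two } },   // only set these when inserting
    { upsert: true, new: true }                                               // create if missing, return the document
);
// doc._id is valid whether the document already existed or was just created,
// so the two length checks collapse into one code path.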
Secondly, don't do this in the main thread. Use a queue mechanism; it will be much more efficient.
The queue I personally use is Bull.
https://github.com/OptimalBits/bull#basic-usage
It also provides the progress-reporting functionality you need.
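A rough illustration of how that could look (the queue name and the helper functions loadRows/upsertRow are placeholders of mine, and this assumes a local Redis instance is available for Bull):
// Sketch: offload the CSV import to a Bull queue and report progress.
const Queue = require('bull');
const csvImportQueue = new Queue('csv-import');   // hypothetical queue name, uses local Redis by default

// Producer: enqueue a job from the Express handler instead of processing the file there.
csvImportQueue.add({ file: req.query.file });

// Consumer: process rows and report progress as a percentage.
csvImportQueue.process(async (job) => {
    const rows = await loadRows(job.data.file);   // placeholder for your CSV parsing
    for (let i = 0; i < rows.length; i++) {
        await upsertRow(rows[i]);                 // placeholder for the findOneAndUpdate logic above
        job.progress(Math.round(((i + 1) / rows.length) * 100));
    }
});

csvImportQueue.on('progress', (job, progress) => {
    console.log(`Import ${job.id}: ${progress}%`);
});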
Also, regarding using async/await with a read stream, plenty of examples can be found online, such as: https://humanwhocodes.com/snippets/2019/05/nodejs-read-stream-promise/
I'm using Ionic 5.4.5. The REST API was created with Node and Express.js.
When I search, the app only looks through the data on the first page. I want the app to search across all the data. How can I do this?
Our API is paginated.
app.get("/units", (req, res) => {
let page = parseInt(req.query.page)
let limit = parseInt(req.query.limit)
if(!page && !limit){
page = 1;
limit = 5;
}
let startIndex = (page -1) * limit
let endIndex = page * limit
let results = {}
if(endIndex < data.length){
results.next = {
page : page + 1,
limit : limit
}
}
if(startIndex > 0){
results.previous = {
page : page - 1,
limit : limit
}
}
results.results = data.slice(startIndex,endIndex)
res.json(results);
});
app.get("/units/:id", (req, res) => {
const itemId = req.params.id;
const item = data.find(_item => _item.id == itemId);
if (item) {
res.json(item);
} else {
res.json({ message: `item ${itemId} doesn't exist`})
}
});
home.page.ts
getItems(ev) {
let val = ev.target.value;
if (val == '') {
this.loadUnit();
return;
}
if (val && val.trim() != '') {
this.unit = this.unit.filter((un,name) => {
return (un.name.toLocaleLowerCase().indexOf(val.toLocaleLowerCase()) > -1);
})
}
}
I also use a service to get data from the API.
Either return all the data from the API and perform filtering and pagination in the app, or perform filtering and pagination (in that order) in the API.
The latter option is probably better if there are a lot of data rows, because:
Sending all rows to the app will use a lot of data and take a lot of time, especially over a WiFi or mobile data connection.
Processing on the server is usually faster than processing on the client, especially if it is a mobile device.
You might also want to return the total number of rows from the API.
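A minimal sketch of what filtering plus pagination in the API could look like for the /units endpoint above (the name query parameter, the total field, and the exact response shape are assumptions on my part, not part of the original API):
// Sketch: filter first, then paginate, and return the total matching row count.
app.get("/units", (req, res) => {
    const page = parseInt(req.query.page) || 1;
    const limit = parseInt(req.query.limit) || 5;
    const search = (req.query.name || '').toLocaleLowerCase();  // hypothetical search parameter

    // 1. Filter across ALL rows, not just the current page.
    const filtered = search
        ? data.filter(item => item.name.toLocaleLowerCase().includes(search))
        : data;

    // 2. Paginate the filtered result.
    const startIndex = (page - 1) * limit;
    const results = {
        total: filtered.length,                                  // total matching rows, for the client
        results: filtered.slice(startIndex, startIndex + limit)
    };
    if (startIndex + limit < filtered.length) results.next = { page: page + 1, limit };
    if (startIndex > 0) results.previous = { page: page - 1, limit };
    res.json(results);
});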