Pagination in DynamoDB using Node.js?

I've had a read through AWS's docs around pagination. As their docs specify:
In a response, DynamoDB returns all the matching results within the scope of the Limit value. For example, if you issue a Query or a Scan request with a Limit value of 6 and without a filter expression, DynamoDB returns the first six items in the table that match the specified key conditions in the request (or just the first six items in the case of a Scan with no filter)
Which means that, given I have a table called Questions with an attribute called difficulty (which can take any numeric value from 0 to 2), I might end up with the following conundrum:
A client makes a request, think GET /questions?difficulty=0&limit=3
I forward that 3 as the Limit of the DynamoDB query, which might return 0 items, since the first 3 items evaluated might not have difficulty == 0
I then have to perform a new query to fetch more questions matching that criterion, and without knowing where to resume I might return duplicates
How can I then paginate based on a query correctly? Something where I'll get as many results as I asked for, whilst having the correct offset?

Using async/await.
let allData = []; // global accumulator, referenced below

const getAllData = async (params) => {
  console.log("Querying Table");
  let data = await docClient.query(params).promise();
  if (data.Items.length > 0) {
    allData = [...allData, ...data.Items];
  }
  if (data.LastEvaluatedKey) {
    params.ExclusiveStartKey = data.LastEvaluatedKey;
    return await getAllData(params);
  } else {
    return data;
  }
}
I am using a global variable allData to collect all the data.
The call to this function is wrapped in a try/catch:
try {
  await getAllData(params);
  console.log("Processing Completed");
  // console.log(allData);
} catch (error) {
  console.log(error);
}
I am using this from within a Lambda and it works fine.
The article here really helped and guided me. Thanks.

Here is an example of how to iterate over a paginated result set from a DynamoDB scan (it can easily be adapted for query as well) in Node.js.
You could save the LastEvaluatedKey state server-side and pass an identifier back to your client, which it would send with its next request; your server would then pass that value as ExclusiveStartKey in the next request to DynamoDB (see the sketch after the code below).
const AWS = require('aws-sdk');
AWS.config.logger = console;
const dynamodb = new AWS.DynamoDB({ apiVersion: '2012-08-10' });

let val = 'some value';
let params = {
  TableName: "MyTable",
  ExpressionAttributeValues: {
    ':val': {
      S: val,
    },
  },
  Limit: 1000,
  FilterExpression: 'MyAttribute = :val',
  // ExclusiveStartKey: thisUsersScans[someRequestParamScanID]
};
dynamodb.scan(params, function scanUntilDone(err, data) {
  if (err) {
    console.log(err, err.stack);
  } else {
    // do something with data
    if (data.LastEvaluatedKey) {
      params.ExclusiveStartKey = data.LastEvaluatedKey;
      dynamodb.scan(params, scanUntilDone);
    } else {
      // all results scanned. done!
      someCallback();
    }
  }
});
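A rough sketch of that server-side token idea, assuming an Express app and a DocumentClient named docClient (the in-memory Map is only for illustration; a real service would use a cache or encode the key into an opaque token):

const crypto = require('crypto');

// Hypothetical in-memory token store: token -> LastEvaluatedKey
const pageTokens = new Map();

// GET /questions?limit=3&next=<token>
app.get('/questions', async (req, res) => {
  const params = {
    TableName: 'Questions',
    Limit: Number(req.query.limit) || 3,
  };
  // resume from the key saved under the client's token, if any
  if (req.query.next && pageTokens.has(req.query.next)) {
    params.ExclusiveStartKey = pageTokens.get(req.query.next);
  }

  const data = await docClient.scan(params).promise();

  // hand the client an opaque token it can send back for the next page
  let next = null;
  if (data.LastEvaluatedKey) {
    next = crypto.randomBytes(8).toString('hex');
    pageTokens.set(next, data.LastEvaluatedKey);
  }
  res.json({ items: data.Items, next });
});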

Avoid using recursion to prevent call stack overflow. An iterative solution extending @Roshan Khandelwal's approach:
const getAllData = async (params) => {
  const _getAllData = async (params, startKey) => {
    if (startKey) {
      params.ExclusiveStartKey = startKey
    }
    return this.documentClient.query(params).promise()
  }

  let lastEvaluatedKey = null
  let rows = []

  do {
    const result = await _getAllData(params, lastEvaluatedKey)
    rows = rows.concat(result.Items)
    lastEvaluatedKey = result.LastEvaluatedKey
  } while (lastEvaluatedKey)

  return rows
}

I hope you figured it out, but just in case others find this useful: AWS has QueryPaginator/ScanPaginator, which is as simple as below:
const paginator = new QueryPaginator(dynamoDb, queryInput);

for await (const page of paginator) {
  // do something with the first page of results
  break
}
See more details at https://github.com/awslabs/dynamodb-data-mapper-js/tree/master/packages/dynamodb-query-iterator
Update 2022-05-19:
For AWS SDK v3, see how to use the paginateXXXX helpers in this blog post: https://aws.amazon.com/blogs/developer/pagination-using-async-iterators-in-modular-aws-sdk-for-javascript/
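A minimal sketch with the v3 document client, for example (the table and index names here are just placeholders):

const { DynamoDBClient } = require('@aws-sdk/client-dynamodb');
const { DynamoDBDocumentClient, paginateQuery } = require('@aws-sdk/lib-dynamodb');

const client = DynamoDBDocumentClient.from(new DynamoDBClient({}));

const getAllItems = async () => {
  const items = [];
  // paginateQuery follows LastEvaluatedKey for you, yielding one page at a time
  const pages = paginateQuery(
    { client },
    {
      TableName: 'Questions',            // placeholder table name
      IndexName: 'difficulty-index',     // placeholder GSI name
      KeyConditionExpression: 'difficulty = :d',
      ExpressionAttributeValues: { ':d': 0 },
    }
  );
  for await (const page of pages) {
    items.push(...page.Items);
  }
  return items;
};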

Query and Scan operations return LastEvaluatedKey in their responses. Absent concurrent insertions, you will not miss items nor will you encounter items multiple times, as long as you iterate calls to Query/Scan and set ExclusiveStartKey to the LastEvaluatedKey of the previous call.
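To tie that back to the question, here is a minimal sketch, assuming a DocumentClient named docClient and that difficulty is not a key attribute, so a FilterExpression is needed and Limit counts items evaluated before the filter:

// Rough sketch: keep scanning until we have enough matching items or run out of data.
const getQuestionsPage = async (difficulty, count, startKey) => {
  const items = [];
  let lastEvaluatedKey = startKey;

  do {
    const data = await docClient.scan({
      TableName: 'Questions',
      FilterExpression: 'difficulty = :d',
      ExpressionAttributeValues: { ':d': difficulty },
      Limit: 100,                         // caps items *evaluated*, not items that pass the filter
      ExclusiveStartKey: lastEvaluatedKey,
    }).promise();

    items.push(...data.Items);
    lastEvaluatedKey = data.LastEvaluatedKey;
  } while (items.length < count && lastEvaluatedKey);

  // return the key as well, so the caller can resume from here next time
  return { items, lastEvaluatedKey };
};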

To create pagination in a DynamoDB scan, use params like:
var params = {
  "TableName" : "abcd",
  "FilterExpression" : "#someexperssion=:someexperssion",
  "ExpressionAttributeNames" : { "#someexperssion": "someexperssion" },
  "ExpressionAttributeValues" : { ":someexperssion": "value" },
  "Limit" : 20,
  "ExclusiveStartKey" : { "id": "9ee10f6e-ce6d-4820-9fcd-cabb0d93e8da" }
};

DB.scan(params).promise();
where ExclusiveStartKey is the LastEvaluatedKey returned by the previous execution of this query.
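For example, a minimal sketch chaining two calls this way (reusing the params above):

const firstPage = await DB.scan(params).promise();

// feed the previous LastEvaluatedKey back in as ExclusiveStartKey
const secondPage = await DB.scan({
  ...params,
  ExclusiveStartKey: firstPage.LastEvaluatedKey,
}).promise();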

Using async/await, returning the data from the awaited calls.
An elaboration on @Roshan Khandelwal's answer.
const getAllData = async (params, allData = []) => {
  const data = await dynamodbDocClient.scan(params).promise()

  if (data.Items.length > 0) {
    allData = [...allData, ...data.Items]
  }

  if (data.LastEvaluatedKey) {
    params.ExclusiveStartKey = data.LastEvaluatedKey
    return await getAllData(params, allData)
  } else {
    return allData
  }
}
Call inside a try/catch:
try {
  const data = await getAllData(params);
  console.log("my data: ", data);
} catch (error) {
  console.log(error);
}

You can create a secondary index on difficulty and, in your query, set a KeyConditionExpression where difficulty = 0, like this:
var params = {
  TableName: 'Questions',
  IndexName: 'difficulty-index',
  KeyConditionExpression: 'difficulty = :difficulty',
  ExpressionAttributeValues: { ':difficulty': 0 }
}
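Combined with pagination, a rough sketch (assuming a DocumentClient named docClient and that the difficulty-index GSI exists) could look like this:

const getQuestionsByDifficulty = async (difficulty, limit, startKey) => {
  const data = await docClient.query({
    TableName: 'Questions',
    IndexName: 'difficulty-index',
    KeyConditionExpression: 'difficulty = :difficulty',
    ExpressionAttributeValues: { ':difficulty': difficulty },
    Limit: limit,                 // counts matching items here, since no filter is applied
    ExclusiveStartKey: startKey,  // pass the previous LastEvaluatedKey to resume
  }).promise();

  return { items: data.Items, lastEvaluatedKey: data.LastEvaluatedKey };
};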

You can also achieve this using recursion instead of a global variable, like:
const getAllData = async (params, allData = []) => {
  let data = await db.scan(params).promise();
  return data.LastEvaluatedKey
    ? getAllData({ ...params, ExclusiveStartKey: data.LastEvaluatedKey }, [...allData, ...data.Items])
    : [...allData, ...data.Items];
};
Then you can simply call it like:
let test = await getAllData({ "TableName": "test-table"}); // feel free to add try/catch

Using DynamoDB pagination with async generators:
let items = []

let params = {
  TableName: 'mytable',
  Limit: 1000,
  KeyConditionExpression: 'mykey = :key',
  ExpressionAttributeValues: {
    ':key': { S: 'myvalue' },
  },
}

async function* fetchData(params) {
  let data
  do {
    data = await dynamodb.query(params).promise()
    yield data.Items
    params.ExclusiveStartKey = data.LastEvaluatedKey
  } while (typeof data.LastEvaluatedKey !== 'undefined')
}

for await (const data of fetchData(params)) {
  items = [...items, ...data]
}

Related

Request Body Read Multiple Value on Node Js

I am new to Node.js development, and I want to ask something that is surely kind of basic, but I don't know how to do it.
I have a task to read one request field that can be filled with multiple values, in JSON like this:
{
  "email" : "first@mail.com", "second@mail.com", "third@mail.com"
}
How do I get each value from that "email" field and process it with some function?
I have a select query to get the username from one table:
select username from [dbo].[users] where email = @email (for first, second, and third)
This is my code for now; it only reads one value, not multiple:
async getValueFunction(req, res) {
  res.setHeader('Content-Type', 'application/json');
  try {
    if (req.body.email != "") {
      const pool = await poolPromise
      const result = await pool.request()
        .input('email', sql.VarChar, req.body.email)
        .query(queries.getUserScoreByEmail)
      var showUserScore = result.recordset;
      res.json(showUserScore);
    } else if (req.body.email == "") {
      const pool = await poolPromise
      const result = await pool.request()
        .query(queries.getUserScore)
      var showAllUserScore = result.recordset;
      res.json(showAllUserScore);
    }
  } catch (error) {
    res.status(500)
    res.send(error.message)
  }
}
How do I do the loop (iteration) and collect the results/recordsets before sending them as one response (JSON)?
You should update your structure, because what you have is not a valid key-value pair. What you can do is store the email addresses in an array, like this:
const data = {
  "email" : ["first@mail.com", "second@mail.com", "third@mail.com"]
}
Then you can access it with data.email.
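For the looping part of the question, here is a rough sketch, reusing the mssql pool and queries from the original code (getValuesFunction and the sequential loop are just one possible way to do it):

async getValuesFunction(req, res) {
  res.setHeader('Content-Type', 'application/json');
  try {
    const emails = req.body.email || [];   // now an array of addresses
    const pool = await poolPromise;
    const allScores = [];

    // run the per-email query one at a time and collect every recordset
    for (const email of emails) {
      const result = await pool.request()
        .input('email', sql.VarChar, email)
        .query(queries.getUserScoreByEmail);
      allScores.push(...result.recordset);
    }

    res.json(allScores);   // send everything back as one JSON response
  } catch (error) {
    res.status(500);
    res.send(error.message);
  }
}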

How do I run an update while streaming using pg-query-stream and pg-promise?

I am trying to load 50,000 items from the database with text in them, tag them, and update the tags.
I am using pg-promise and pg-query-stream for this purpose.
I was able to get the streaming part working properly, but updating has become problematic with so many update statements.
Here is my existing code:
const QueryStream = require('pg-query-stream')
const JSONStream = require('JSONStream')

function prepareText(title, content, summary) {
  let description
  if (content && content.length) {
    description = content
  } else if (summary && summary.length) {
    description = summary
  } else {
    description = ''
  }
  return title.toLowerCase() + ' ' + description.toLowerCase()
}

async function tagAll({ db, logger, tagger }) {
  // you can also use pgp.as.format(query, values, options)
  // to format queries properly, via pg-promise;
  const qs = new QueryStream(
    'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY pubdate DESC, feed_item_id DESC'
  )
  try {
    const result = await db.stream(qs, (s) => {
      // initiate streaming into the console:
      s.pipe(JSONStream.stringify())
      s.on('data', async (item) => {
        try {
          s.pause()
          // eslint-disable-next-line camelcase
          const { feed_item_id, title, summary, content } = item
          // Process text to be tagged
          const text = prepareText(title, summary, content)
          const tags = tagger.tag(text)
          // Update tags per post
          await db.query(
            'UPDATE feed_items SET tags=$1 WHERE feed_item_id=$2',
            // eslint-disable-next-line camelcase
            [tags, feed_item_id]
          )
        } catch (error) {
          logger.error(error)
        } finally {
          s.resume()
        }
      })
    })
    logger.info(
      'Total rows processed:',
      result.processed,
      'Duration in milliseconds:',
      result.duration
    )
  } catch (error) {
    logger.error(error)
  }
}

module.exports = tagAll
The db object is the one from pg-promise, whereas the tagger simply extracts an array of tags from the text; the result is contained in the variable tags.
Too many update statements are executing, from what I can see in the diagnostics. Is there a way to batch them?
If you can do everything with one SQL statement, you should! Here you're paying the price of a round trip between Node and your DB for each row of your table, which will take most of the time of your query.
Your request can be implemented in pure SQL:
update feed_items set tags = case
  when (content = '') is false then lower(title) || ' ' || lower(content)
  when (summary = '') is false then lower(title) || ' ' || lower(summary)
  else title
end;
This request will update your whole table at once. I'm sure it'd be orders of magnitude faster than your method. On my machine, with a table containing 100,000 rows, the update time is about 600 ms.
Some remarks:
You don't need to order to update. As ordering is quite slow, it's better not to.
I guess the limit part was there because it was too slow? If that is the case, you can drop it; 50,000 rows is not a big table for Postgres.
I bet this pg-stream thing does not really stream stuff out of the DB; it only lets you use a stream-like API on the results it gathered earlier... No problem with that, but I thought maybe there was a misconception here.
This is the best I could come up with to batch the queries inside the stream, so that we don't need to load all the data in memory or run too many queries. If anyone knows a better way to batch, especially with t.sequence, feel free to add another answer.
const BATCH_SIZE = 5000

async function batchInsert({ db, pgp, logger, data }) {
  try {
    // https://vitaly-t.github.io/pg-promise/helpers.ColumnSet.html
    const cs = new pgp.helpers.ColumnSet(
      [
        { name: 'feed_item_id', cast: 'uuid' },
        { name: 'tags', cast: 'varchar(64)[]' },
      ],
      {
        table: 'feed_items',
      }
    )
    const query =
      pgp.helpers.update(data, cs) + ' WHERE v.feed_item_id=t.feed_item_id'
    await db.none(query)
  } catch (error) {
    logger.error(error)
  }
}

async function tagAll({ db, pgp, logger, tagger }) {
  // you can also use pgp.as.format(query, values, options)
  // to format queries properly, via pg-promise;
  const qs = new QueryStream(
    'SELECT feed_item_id,title,summary,content FROM feed_items ORDER BY pubdate DESC, feed_item_id DESC'
  )
  try {
    const queryValues = []
    const result = await db.stream(qs, (s) => {
      // initiate streaming into the console:
      s.pipe(JSONStream.stringify())
      s.on('data', async (item) => {
        try {
          s.pause()
          // eslint-disable-next-line camelcase
          const { feed_item_id, title, summary, content } = item
          // Process text to be tagged
          const text = prepareText(title, summary, content)
          const tags = tagger.tag(text)
          queryValues.push({ feed_item_id, tags })
          if (queryValues.length >= BATCH_SIZE) {
            const data = queryValues.splice(0, queryValues.length)
            await batchInsert({ db, pgp, logger, data })
          }
        } catch (error) {
          logger.error(error)
        } finally {
          s.resume()
        }
      })
    })
    await batchInsert({ db, pgp, logger, data: queryValues })
    return result
  } catch (error) {
    logger.error(error)
  }
}

How to make Mongoose update work with await?

I'm creating a NodeJS backend where a process reads in data from a source, checks for changes compared to the current data, makes those updates to MongoDB and reports the changes made. Everything works, except I can't get the changes reported, because I can't get the Mongoose update action to await.
The returned array from this function is then displayed by a Koa server. It shows an empty array, and in the server logs, the correct values appear after the server has returned the empty response.
I've dug through the Mongoose docs and Stack Overflow questions – quite a few questions about the topic – but with no success. None of the solutions provided seem to help. I've isolated the issue to this part: if I remove the Mongoose part, everything works as expected.
const parseJSON = async xmlData => {
  const changes = []
  const games = await Game.find({})
  const gameObjects = games.map(game => {
    return new GameObject(game.name, game.id, game)
  })
  let jsonObj = require("../sample.json")
  Object.keys(jsonObj.items.item).forEach(async item => {
    const game = jsonObj.items.item[item]
    const gameID = game["#_objectid"]
    const rating = game.stats.rating["#_value"]
    if (rating === "N/A") return
    const gameObject = await gameObjects.find(
      game => game.bgg === parseInt(gameID)
    )
    if (gameObject && gameObject.rating !== parseInt(rating)) {
      try {
        const updated = await Game.findOneAndUpdate(
          { _id: gameObject.id },
          { rating: rating },
          { new: true }
        ).exec()
        changes.push(
          `${updated.name}: ${gameObject.rating} -> ${updated.rating}`
        )
      } catch (error) {
        console.log(error)
      }
    }
  })
  return changes
}
Everything works – the changes are found and the database is updated, but the reported changes are returned too late, because the execution doesn't wait for Mongoose.
I've also tried this instead of findOneAndUpdate():
const updated = await Game.findOne()
  .where("_id")
  .in([gameObject.id])
  .exec()
updated.rating = rating
await updated.save()
The same results here: everything else works, but the async doesn't.
As @Puneet Sharma mentioned, you'll have to map instead of forEach to get an array of promises, then await on the promises (using Promise.all for convenience) before returning changes that will then have been populated:
const parseJSON = async xmlData => {
  const changes = []
  const games = await Game.find({})
  const gameObjects = games.map(game => {
    return new GameObject(game.name, game.id, game)
  })
  const jsonObj = require("../sample.json")
  const promises = Object.keys(jsonObj.items.item).map(async item => {
    const game = jsonObj.items.item[item]
    const gameID = game["#_objectid"]
    const rating = game.stats.rating["#_value"]
    if (rating === "N/A") return
    const gameObject = await gameObjects.find(
      game => game.bgg === parseInt(gameID)
    )
    if (gameObject && gameObject.rating !== parseInt(rating)) {
      try {
        const updated = await Game.findOneAndUpdate(
          { _id: gameObject.id },
          { rating: rating },
          { new: true }
        ).exec()
        changes.push(
          `${updated.name}: ${gameObject.rating} -> ${updated.rating}`
        )
      } catch (error) {
        console.log(error)
      }
    }
  })
  await Promise.all(promises)
  return changes
}
(The diff, for convenience:
9,10c9,10
< let jsonObj = require("../sample.json")
< Object.keys(jsonObj.items.item).forEach(async item => {
---
> const jsonObj = require("../sample.json")
> const promises = Object.keys(jsonObj.items.item).map(async item => {
33a34
> await Promise.all(promises)
)
EDIT: a further refactoring would be to use that array of promises for the change descriptions themselves. Basically changePromises is an array of Promises that resolve to a string or null (if there was no change), so a .filter with the identity function will filter out the falsy values.
This method also has the advantage that changes will be in the same order as the keys were iterated over; with the original code, there's no guarantee of order. That may or may not matter for your use case.
I also flipped the if/elses within the map function to reduce nesting; it's a matter of taste really.
Ps. That await Game.find({}) will be a problem when you have a large collection of games.
const parseJSON = async xmlData => {
  const games = await Game.find({});
  const gameObjects = games.map(game => new GameObject(game.name, game.id, game));
  const jsonGames = require("../sample.json").items.item;

  const changePromises = Object.keys(jsonGames).map(async item => {
    const game = jsonGames[item];
    const gameID = game["#_objectid"];
    const rating = game.stats.rating["#_value"];
    if (rating === "N/A") {
      // Rating from data is N/A, we don't need to update anything.
      return null;
    }
    const gameObject = await gameObjects.find(game => game.bgg === parseInt(gameID));
    if (!(gameObject && gameObject.rating !== parseInt(rating))) {
      // Game not found or its rating is already correct; no change.
      return null;
    }
    try {
      const updated = await Game.findOneAndUpdate(
        { _id: gameObject.id },
        { rating: rating },
        { new: true },
      ).exec();
      return `${updated.name}: ${gameObject.rating} -> ${updated.rating}`;
    } catch (error) {
      console.log(error);
    }
  });

  // Await for the change promises to resolve, then filter out the `null`s.
  return (await Promise.all(changePromises)).filter(c => c);
};

nodeJS: how to call an async function within a loop in another async function call

I am trying to call one async function from inside a loop run by another async function. These functions call APIs, and I am using request-promise with Node.js.
functions.js file:
const rp = require("request-promise");

// function (1)
email_views: async emailId => {
  let data = {};
  await rp({
    url: 'myapiurl',
    qs: { accessToken: 'xyz', emailID: emailId },
    method: 'GET'
  })
    .then( body => { data = JSON.parse(body) })
    .catch( error => { console.log(error} );
  return data;
};
The above JSON looks like this:
...
data: {
  records: [
    {
      ...
      contactID: 123456,
      ...
    },
    {
      ...
      contactID: 456789,
      ...
    }
  ]
}
...
I am running a loop to get the individual records, getting the contactID associated with each of them.
// function#2 (also in functions.js file)
contact_detail: async contactId => {
  let data = {};
  await rp({
    url: 'myapiurl2',
    qs: { accessToken: 'xyz', contactID: contactId },
    method: 'GET'
  })
    .then( body => { data = JSON.parse(body) })
    .catch( error => { console.log(error} );
  return data;
};
The above function takes one contactId as a parameter and gets that contact's details by calling another API endpoint.
Both functions work fine when they are called separately. But I am trying to do it inside a loop like this:
...
const result = await email_views(99999); // function#1
const records = result.data.records;
...
let names = "";
for (let i = 0; i < records.length; i++) {
  ...
  const cId = records[i].contactID;
  const contact = await contact_detail(cId); // function#2
  names += contact.data.firstName + " " + contact.data.lastName + " ";
  ...
}
console.log(names);
...
The problem is that I am only getting the first contact back from the above code block, i.e. even though I have 20 records from function#1, when I call contact_detail (function#2) in the loop for each contactID (cId), I get the contact detail once, for the first cId only. For the rest I get nothing!
What is the correct way to achieve this using Node.js?
UPDATE:
const { App } = require("jovo-framework");
const { Alexa } = require("jovo-platform-alexa");
const { GoogleAssistant } = require("jovo-platform-googleassistant");
const { JovoDebugger } = require("jovo-plugin-debugger");
const { FileDb } = require("jovo-db-filedb");
const custom = require("./functions");
const menuop = require("./menu");
const stateus = require("./stateus");
const alexaSpeeches = require("./default_speech");

const app = new App();
app.use(new Alexa(), new GoogleAssistant(), new JovoDebugger(), new FileDb());

let sp = "";

async EmailViewsByContactIntent() {
  try {
    const viewEmailId =
      this.$session.$data.viewEmailIdSessionKey != null
        ? this.$session.$data.viewEmailIdSessionKey
        : this.$inputs.view_email_Id_Number.value;
    let pageIndex =
      this.$session.$data.viewEmailPageIndex != null
        ? this.$session.$data.viewEmailPageIndex
        : 1;
    const result = await custom.email_views_by_emailId(
      viewEmailId,
      pageIndex
    );
    const records = result.data.records;
    if (records.length > 0) {
      const totalRecords = result.data.paging.totalRecords;
      this.$session.$data.viewEmailTotalPages = totalRecords;
      sp = `i have found a total of ${totalRecords} following view records. `;
      if (totalRecords > 5) {
        sp += `i will tell you 5 records at a time. for next 5 records, please say, next. `;
        this.$session.$data.viewEmailIdSessionKey = this.$inputs.view_email_Id_Number.value;
        this.$session.$data.viewEmailPageIndex++;
      }
      for (let i = 0; i < records.length; i++) {
        const r = records[i];
        /* Here I want to pass r.contactID as contactId in the function contact_detail like this: */
        const contact = await custom.contact_detail(r.contactID);
        const contact_name = contact.data.firstName + " " + contact.data.lastName;
        /* The above two lines of code fetch contact_name for the first r.contactID and for the rest I get an empty string only. */
        const formatted_date = r.date.split(" ")[0];
        sp += `contact ID ${spellOut_speech_builder(
          r.contactID
        )} had viewed on ${formatted_date} from IP address ${
          r.ipAddress
        }. name of contact is, ${contact_name}. `;
      }
      if (totalRecords > 5) {
        sp += ` please say, next, for next 5 records. `;
      }
    } else {
      sp = ``;
    }
    this.ask(sp);
  } catch (e) {
    this.tell(e);
  }
}
I am building an Alexa skill using the Jovo framework and Node.js.
UPDATE #2
As a test, I only returned the contactId that I am passing to the contact_detail function, and I am getting the correct value back in the code above (under my first UPDATE).
contact_detail: async contactId => {
  return contactId;
}
It seems that even after getting the value right, the function is somehow failing to execute. However, the same contact_detail function works perfectly OK when I call it from another place. It just doesn't work inside a loop.
What could be the reason?
I must be missing something but don't know what!
You are mixing async/await and promises together, which is causing you confusion. You would typically use one or the other (async/await effectively provides syntactic sugar so you can avoid dealing with the verbose promise code) in a given location.
Because you mixed the two, you are in a weird area where the behavior is harder to nail down.
If you want to use async/await, your functions should look like:
contact_detail: async contactId => {
  try {
    const body = await rp({
      url: 'myapiurl2',
      qs: { ... }
    });
    return JSON.parse(body);
  } catch(e) {
    console.log(e);
    // This will return undefined in exception cases. You may want to catch at a higher level.
  }
};
or with promises
contact_detail: async contactId => {
  return rp({
    url: 'myapiurl2',
    qs: { ... }
  })
    .then( body => JSON.parse(body))
    .catch( error => {
      console.log(error);
      // This will return undefined in exception cases. You probably don't want to catch here.
    });
};
Keep in mind that your current code executing the function will do each call in series. If you want to do them in parallel, you will need to call the function a bit differently and use something like Promise.all to resolve the results.
Here you go:
...
const result = await email_views(99999); // function#1
const records = result.data.records;
...
let names = "";
await Promise.all(records.map(async record => {
  let cId = record.contactID;
  let contact = await contact_detail(cId);
  names += contact.data.firstName + " " + contact.data.lastName + " ";
}));
console.log(names);
...
I'm posting this as an answer only because I need to show you some multi-line code as part of troubleshooting this. Not sure this solves your issue yet, but it is a problem.
Your contact_detail() function is not properly returning errors. Instead, it eats the error and resolves with an empty object. That could be what is causing your blank names. It should just return the promise directly, and if you want to log the error, then it needs to rethrow. Also, there's no reason for it to be declared async or to use await. You can just return the promise directly. You can also let request-promise parse the JSON response for you.
Also, I notice there appears to be a syntax error in your .catch(), which could also be part of the problem.
contact_detail: contactId => {
  return rp({
    url: 'myapiurl2',
    qs: { accessToken: 'xyz', contactID: contactId },
    json: true,
    method: 'GET'
  }).catch( error => {
    // log error and rethrow so any error propagates
    console.log(error);
    throw error;
  });
};
Then, you would call this like you originally were (note you still use await when calling it because it returns a promise):
...
const result = await email_views(99999); // function#1
const records = result.data.records;
...
let names = "";
for (let i = 0; i < records.length; i++) {
  ...
  const cId = records[i].contactID;
  const contact = await contact_detail(cId);
  names += contact.data.firstName + " " + contact.data.lastName + " ";
  ...
}
console.log(names);
...

DynamoDB scan shows fewer items than DynamoDB console

Why does a scan with Node.js show just 3 results, while the DynamoDB admin tool shows 9?
var params = {
  TableName: process.env.DYNAMODB_TABLE_LIGHTHOUSE,
  FilterExpression: '#blickArticleId = :valblickArticleId AND #firstFolder = :valfirstFolder',
  ExpressionAttributeNames: {
    '#blickArticleId': 'blickArticleId',
    '#firstFolder': 'firstFolder'
  },
  ExpressionAttributeValues: {
    ':valblickArticleId': 'null',
    ':valfirstFolder': 'null'
  },
};

const queryResponse = await dynamoDb.scan(params).promise()
Isn't that the same?
Are you sure your scanned content is not more than 1 MB?
If the total number of scanned items exceeds the maximum data set size limit of 1 MB, the scan stops and results are returned to the user with a LastEvaluatedKey.
Then you can scan the remaining items by using the LastEvaluatedKey.
As jarmod mentioned, pagination is the solution:
const getLalalalalal = async () => {
  var params = {
    TableName: process.env.DYNAMODB_TABLE_LIGHTHOUSE,
    FilterExpression: '#blickArticleId = :valblickArticleId AND #firstFolder = :valfirstFolder',
    ExpressionAttributeNames: {
      '#blickArticleId': 'blickArticleId',
      '#firstFolder': 'firstFolder'
    },
    ExpressionAttributeValues: {
      ':valblickArticleId': 'null',
      ':valfirstFolder': 'null'
    },
  };
  return await scanTable(params)
}

const scanTable = async (params) => {
  let scanResults = [];
  let items;
  do {
    items = await dynamoDb.scan(params).promise();
    items.Items.forEach((item) => scanResults.push(item));
    params.ExclusiveStartKey = items.LastEvaluatedKey;
  } while (typeof items.LastEvaluatedKey != "undefined");
  return scanResults;
};
Your code needs to scan for the remaining items: the presence of a non-null LastEvaluatedKey in the response indicates that the results are paginated. The AWS DynamoDB console is presumably doing that pagination for you.
The scanTable function above is an example of how to fetch all items when the results are paginated.
