How to paginate a child array with Node.js and AWS DynamoDB

I want to paginate the "car_types" array using AWS DynamoDB and Node.js. I don't want to do the paging in JavaScript; can it be done with DynamoDB itself? I want total items, total pages, page size, current page, and the data in the response. Here is the item in my table:
{
  "uid": "222-3333",
  "car_types": [
    {
      "description": "fsdf",
      "title": "sdfsd"
    },
    {
      "description": "fdfdfdf",
      "title": "dfdfd"
    },
    {
      "description": "dasda",
      "title": "asdas"
    },
    {
      "description": "dasd",
      "title": "asdas"
    },
    {
      "description": "dasdasd",
      "title": "asdas"
    }
  ]
}
Here is the AWS DynamoDB and Node.js code I am using to get the result:
import AWS from "aws-sdk";

const docClient = new AWS.DynamoDB.DocumentClient();

export function get_car_types_list() {
  var params = {
    TableName: "cms_cars",
    KeyConditionExpression: "#uid = :uid",
    ExpressionAttributeNames: {
      "#uid": "uid"
    },
    ExpressionAttributeValues: {
      ":uid": "222-3333"
    }
  };
  return docClient.query(params).promise()
    .then(function (data) {
      console.log(data);
      return data;
    }).catch((err) => {
      console.log('got Error', err);
    });
}
This is the result I want from the DynamoDB query:
{
  "totalItem": 5,
  "totalPage": 1,
  "pageSize": "1",
  "currentPage": "1",
  "car_types": [
    {
      "description": "fsdf",
      "title": "sdfsd"
    },
    {
      "description": "fdfdfdf",
      "title": "dfdfd"
    },
    {
      "description": "dasda",
      "title": "asdas"
    },
    {
      "description": "dasd",
      "title": "asdas"
    },
    {
      "description": "dasdasd",
      "title": "asdas"
    }
  ]
}

DynamoDB returns at most 1 MB of data per scan/query call, and a LastEvaluatedKey is included in the result if any data remains. If you pass ExclusiveStartKey: LastEvaluatedKey on the next call, you can scan/query with pagination. I added some tweaks to your approach that may help you.
Edit: You can limit the result by passing Limit: Number in your params. This caps the number of items returned per call, and you can fetch the rest with LastEvaluatedKey. A sketch of how a caller can turn this into the page metadata you asked for follows the function below.
export function get_car_types_list(LastEvaluatedKey) {
  var params = {
    TableName: "cms_cars",
    KeyConditionExpression: "#uid = :uid",
    ExpressionAttributeNames: {
      "#uid": "uid"
    },
    ExpressionAttributeValues: {
      ":uid": "222-3333"
    },
    Limit: 5, // cap the number of items returned per call
  };
  if (LastEvaluatedKey) {
    // continue from where the previous page stopped
    params.ExclusiveStartKey = LastEvaluatedKey;
  }
  return docClient.query(params).promise()
    .then(function (data) {
      console.log(data);
      return data;
    }).catch((err) => {
      console.log('got Error', err);
    });
}
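DynamoDB itself does not return fields like totalItem or totalPage, so the caller has to count matching items separately and carry LastEvaluatedKey between requests. Here is a rough sketch of one way to do that (my own addition, not part of the original answer): the helper name getCarTypesPage and the Select: "COUNT" counting query are assumptions, and note that Limit/ExclusiveStartKey paginate top-level items, not the elements of a nested list such as car_types inside a single item.

import AWS from "aws-sdk";

const docClient = new AWS.DynamoDB.DocumentClient();
const PAGE_SIZE = 5;

export async function getCarTypesPage(currentPage, lastEvaluatedKey) {
  const base = {
    TableName: "cms_cars",
    KeyConditionExpression: "#uid = :uid",
    ExpressionAttributeNames: { "#uid": "uid" },
    ExpressionAttributeValues: { ":uid": "222-3333" }
  };

  // Count all matching items (returns no data, but still consumes read capacity).
  const countResult = await docClient.query({ ...base, Select: "COUNT" }).promise();
  const totalItem = countResult.Count;

  // Fetch one page, continuing after the previous page's LastEvaluatedKey.
  const pageParams = { ...base, Limit: PAGE_SIZE };
  if (lastEvaluatedKey) {
    pageParams.ExclusiveStartKey = lastEvaluatedKey;
  }
  const page = await docClient.query(pageParams).promise();

  return {
    totalItem,
    totalPage: Math.ceil(totalItem / PAGE_SIZE),
    pageSize: PAGE_SIZE,
    currentPage,
    items: page.Items,
    // Pass this back on the next call to get the following page.
    lastEvaluatedKey: page.LastEvaluatedKey
  };
}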

Related

Delete Method in Swagger with SQLite3 and NodeJS

I have an OpenAPI spec built with Swagger. The GET and POST methods work fine, but DELETE does not.
index.ts
app.use("/deleteProduct/{id}", deleteProduct);

delete.ts
import { Router } from "express";
import { Database } from "sqlite3";
import database from "./databaseConnection";

const deleteProduct = Router()

function removeProduct(id: string, db: Database) {
  console.log(id);
  return new Promise((resolve, reject) => {
    db.serialize(() => {
      db.run(`DELETE FROM product WHERE product_id = ?`, id, (err) => {
        if (err) {
          reject(err);
        }
        resolve("Success");
      })
    })
  })
}

deleteProduct.delete("/:id", async (req, res) => {
  try {
    res.json(await removeProduct(req.params.id, database));
  } catch (err) {
    console.error(`Error removing the product`, err.message);
  }
});

export default deleteProduct
After pressing the "Execute" button, nothing happens.
Can someone please help me or give me a hint as to where the mistake might lie?
Here is the swagger.json; maybe there is something wrong in it:
"/deleteProduct/{id}": {
"delete": {
"tags": ["Delete"],
"description": "Removes product from database",
"produces": "application/json",
"parameters": [
{
"in": "path",
"name": "id",
"description": "Id of the product",
"required": true,
"schema": {
"type": "integer"
}
}],
"responses": {
"200": {
"description": "Product was deleted",
"content": {
"application/json": {
"schema": {
"type": "array"
}
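One thing worth checking (an assumption on my part, not something confirmed above): Express declares path parameters with a colon rather than the curly-brace syntax OpenAPI uses, and app.use matches by prefix, so the usual pattern when the router itself handles /:id is to mount it on the base path only. A hypothetical sketch:

// index.ts (sketch) -- mount the router on the base path;
// the ":id" segment is already declared inside delete.ts
import express from "express";
import deleteProduct from "./delete";

const app = express();
app.use("/deleteProduct", deleteProduct);

app.listen(3000);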

Remove object from nested array in MongoDB using NodeJS

I can see that this question should have been answered here, but the code simply doesn't work for me (I have tried multiple, similar variations).
Here is my data:
[{
  "_id": {
    "$oid": "628cadf43a2fd997be8ce242"
  },
  "dcm": 2,
  "status": true,
  "comments": [
    {
      "id": 289733,
      "dcm": 2,
      "status": true,
      "clock": "158",
      "user": "Nathan Field",
      "dept": "IT",
      "department": [],
      "dueback": "",
      "comment": "test 1"
    },
    {
      "id": 289733,
      "dcm": 2,
      "status": true,
      "clock": "158",
      "user": "Nathan Field",
      "dept": "IT",
      "department": [],
      "dueback": "",
      "comment": "test 2"
    }
  ],
  "department": [],
  "dueback": ""
}]
And here is my code
const deleteResult = await db.collection('status').updateOne(
  { "dcm": comments.dcm },
  { $pull: { "comments": { "id": comments.id } } },
  { upsert: false },
  { multi: true }
);
Absolutely nothing happens...
So the issue ended up being something to do with running multiple update operations within one function. I have a database connection function like this:
const { MongoClient } = require('mongodb');

const withDB = async (operations, res) => {
  try {
    const client = await MongoClient.connect('mongodb://localhost:27017', { useNewUrlParser: true });
    const db = client.db('collection');
    await operations(db);
    client.close();
  } catch (error) {
    res.status(500).json({ message: 'Error connecting to db', error });
  }
}
And then I call this by using:
withDB(async (db) => {
  await db.collection('status').updateMany(
    { "dcm": comments.dcm },
    { $pull: { "comments": { "id": comments.id } } },
    { multi: true }
  );
});
The issue, it would seem, occurred because I had two of these update operations within one withDB call. I have multiple operations in other instances (update an item, then fetch the collection), but for some reason this caused an issue.
I created a separate call to the withDB function to perform the $pull (delete) request, and then updated the array with the new comments in another call.
To check that there was nothing wrong with my actual query, I used Studio 3T's IntelliShell feature. If I'd done that sooner I would have saved myself a lot of time!
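As a rough illustration of that separation (my own sketch, assuming the withDB helper above and an async route handler; comments and updatedComments stand for whatever the handler already has in scope):

// First withDB call: pull the comment out of the nested array.
await withDB(async (db) => {
  await db.collection('status').updateOne(
    { "dcm": comments.dcm },
    { $pull: { "comments": { "id": comments.id } } }
  );
}, res); // pass res so the catch block in withDB can respond on error

// Second, separate withDB call: write the updated comments array back.
await withDB(async (db) => {
  await db.collection('status').updateOne(
    { "dcm": comments.dcm },
    { $set: { "comments": updatedComments } }
  );
}, res);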

How to pull out object heading from an array

I have a JSON response structure like this
{
  "_id": "620e97d76ca392a43097cca6",
  "user": "620295cbd67ece90802d2522",
  "orderId": "EnrL7C",
  "Items": [
    {
      "product": {
        "name": "Fresh Salad",
        "id": "61f2911723ff35136c98ad3e"
      },
      "quantity": 1,
      "price": 1250,
      "_id": "620e97d76ca392a43097cca7"
    },
  ],
}
But I want product not to be a nested object, so it should look like this:
{
  "_id": "620e97d76ca392a43097cca6",
  "user": "620295cbd67ece90802d2522",
  "orderId": "EnrL7C",
  "Items": [
    {
      "name": "Fresh Salad",
      "id": "61f2911723ff35136c98ad3e",
      "quantity": 1,
      "price": 1250,
      "_id": "620e97d76ca392a43097cca7"
    },
  ],
}
This is my code responsible for the response output
exports.getOrder = (req, res) => {
  Order.findOne({ orderId: 'EnrL7C' })
    .populate("Items.product", "name")
    .exec((error, order) => {
      if (error) return res.status(400).json({ error });
      if (order) {
        return res.json(order);
      } else {
        return res.json(['No order found']);
      }
    });
};
Sometimes, when I'm too lazy to look up all the Mongoose documentation and figure out what version I'm on, I use .lean() to convert the result to a plain JS object, which I'm way more comfortable working with.
exports.getOrder = (req, res) => {
  Order.findOne({ orderId: "EnrL7C" })
    .lean() // add lean
    .populate("Items.product", "name")
    .exec((error, order) => {
      if (error) return res.status(400).json({ error });
      if (order) {
        // fix the structure in javascript
        order.Items = order.Items.map((item) => {
          const flat = {
            ...item.product,
            ...item,
          };
          delete flat.product;
          return flat;
        });
        return res.json(order);
      } else {
        return res.json(["No order found"]);
      }
    });
};
Let me know if that doesn't work, so I can update the answer.

Find index of object in list in DynamoDB using Lambda

I am trying to use a Lambda script to find the index of an object in a list. As an example, I would like to pass in "userid": "041c9004" and "author": "J.K Rowling" and return index=1 from the books list. If the object does not exist within the books list, return index=-1. We can assume that there will be no duplicate entries.
The structure of the DynamoDB table looks like this. Userid is the primary key.
{
  "books": [
    {
      "author": "J.R.R. Tolkien",
      "title": "Lord of the Rings"
    },
    {
      "author": "J.K Rowling",
      "title": "Harry Potter"
    },
    {
      "author": "George RR Martin",
      "title": "A Song of Ice and Fire"
    }
  ],
  "isactive": true,
  "ispublic": true,
  "lastupdated": 1597690265,
  "userid": "041c9004"
}
Here is what I have written for the Lambda function so far. It is returning index=-1.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

exports.handler = function(event, context, callback) {
  var params = {
    TableName: 'Users',
    Key: {
      userid: event.userid
    }
  };
  docClient.get(params, function(err, data) {
    if (err) {
      callback(err, null);
    } else {
      callback(null, data);
      // trying to populate this variable with the correct index
      var indexOfAuthor = data.Item.books.indexOf(event.author);
      console.log('The index of the author is ' + indexOfAuthor);
    }
  });
};
Assuming event.author is just the author name, you could try:
data.Item.books.findIndex(i => i.author === event.author)
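For context, here is a minimal sketch of where that call could sit in the handler (my own illustration, not the answerer's code). The original indexOf always returns -1 because books holds objects, not strings, so strict equality never matches:

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

exports.handler = function (event, context, callback) {
  var params = {
    TableName: 'Users',
    Key: { userid: event.userid }
  };
  docClient.get(params, function (err, data) {
    if (err) {
      callback(err, null);
      return;
    }
    // Compare the author field of each object in the list.
    var books = (data.Item && data.Item.books) || [];
    var index = books.findIndex(function (b) { return b.author === event.author; });
    callback(null, { index: index }); // -1 when the author is not found
  });
};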

AWS CloudSearch Upload JSON: Value tag cannot be array or object

I am running a Lambda function (Node.js) to upload some documents to AWS CloudSearch, and I keep getting the following error:
{
  "errorMessage": "{ [\"The value of tags cannot be a JSON array or object\"] }",
  "errorType": "DocumentServiceException",
  "stackTrace": [
    "Object.extractError (/var/task/node_modules/aws-sdk/lib/protocol/json.js:48:27)",
    "Request.extractError (/var/task/node_modules/aws-sdk/lib/protocol/rest_json.js:37:8)",
    "Request.callListeners (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:105:20)",
    "Request.emit (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:77:10)",
    "Request.emit (/var/task/node_modules/aws-sdk/lib/request.js:678:14)",
    "Request.transition (/var/task/node_modules/aws-sdk/lib/request.js:22:10)",
    "AcceptorStateMachine.runTo (/var/task/node_modules/aws-sdk/lib/state_machine.js:14:12)",
    "/var/task/node_modules/aws-sdk/lib/state_machine.js:26:10",
    "Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:38:9)",
    "Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:680:12)"
  ]
}
I have followed the document format of
var item = {
  type: 'add',
  id: key,
  fields: {
    userid: value.userId,
    storyid: value.storyId,
    description: value.description,
    title: value.title,
    type: 'xyz'
  }
}
This is the code I am using to upload the data
exports.handle = function(e, ctx, cb) {
  ctx.callbackWaitsForEmptyEventLoop = false;
  var documentsBatch = e.data;
  var params = {
    contentType: 'application/json',
    documents: JSON.stringify(documentsBatch)
  };
  var req = cloudsearchdomain.uploadDocuments(params, function(err, data) {
    if (err) {
      // an error occurred
      cb(err, null);
    } else {
      // successful response
    }
  });
  req.send();
}
My stringified data, when logged, looks similar to this:
[
  {
    "type": "add",
    "id": "FpgAxxxxKrM4utxosPy23--KhO6FgvxK",
    "fields": {
      "userid": "FpgARscKlxaxutxosPy23",
      "storyid": "-KhxbPpRP7REEK",
      "description": "xyz 🔥 🔥",
      "title": "umm",
      "type": "story"
    }
  },
  {
    "type": "add",
    "id": "FccccxosPy23--KiYbrrPjtJVk2bghO-W",
    "fields": {
      "userid": "FpgARfPy23",
      "storyid": "-KiYbrfggO-W",
      "description": "noo",
      "title": "lalaa out",
      "type": "story"
    }
  }
]
Can someone point me in the right direction?
The problem was with another JSON object in the batch that had an additional top-level attribute other than fields. Once I found and removed it, everything worked. There should be a linter for this, or the SDK should throw a better exception.
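As far as I know there is no built-in linter for this, but a small pre-upload check along these lines can catch the offending document before it reaches CloudSearch (my own sketch; the allowed key list assumes add-type documents like the ones shown above):

// Hypothetical pre-upload check: reject any document in the batch that carries
// top-level keys other than the ones the add-document format shown above uses.
var ALLOWED_KEYS = ['type', 'id', 'fields'];

function validateBatch(documentsBatch) {
  documentsBatch.forEach(function (doc, i) {
    var extra = Object.keys(doc).filter(function (k) {
      return ALLOWED_KEYS.indexOf(k) === -1;
    });
    if (extra.length > 0) {
      throw new Error('Document ' + i + ' (id: ' + doc.id + ') has unexpected keys: ' + extra.join(', '));
    }
  });
}

// Usage inside the handler, before JSON.stringify(documentsBatch):
// validateBatch(documentsBatch);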
