The provided key element does not match the schema - JavaScript - Node.js

I'm generating some data and trying to insert it in the database.
const AWSXRay = require('aws-xray-sdk');
const AWS = AWSXRay.captureAWS(require('aws-sdk'));
const documentClient = new AWS.DynamoDB.DocumentClient();

module.exports = {
  batchWriteItems: async function (tableName, headers, data) {
    // AWS only allows batches of max 25 items
    while (data.length) {
      const batch = data.splice(0, 25);
      const putRequests = batch.map((elem) => {
        // Override field type (default is String)
        let items = {};
        for (let key in elem) {
          const header = headers.find((hdr) => hdr.name === key);
          items[key] = {
            [header.type]: elem[key],
          };
        }
        return {
          PutRequest: {
            Item: items,
          },
        };
      });
      const params = {
        RequestItems: {
          [tableName]: putRequests,
        },
      };
      console.log(JSON.stringify(params));
      await documentClient.batchWrite(params).promise();
    }
  },
};
The table is correct: the partition key is countryCode (S) and there is no sort key.
If I copy the items individually and create an item manually in DynamoDB, it works!
I don't see any issues here but I'm still getting this error:
The provided key element does not match the schema
Output of console.log(JSON.stringify(params)):
{
  "RequestItems": {
    "test-table": [
      {
        "PutRequest": {
          "Item": {
            "countryCode": { "S": "LX" },
            "value": { "N": "0.67" }
          }
        }
      },
      {
        "PutRequest": {
          "Item": {
            "countryCode": { "S": "AF" },
            "value": { "N": "0.68" }
          }
        }
      }
    ]
  }
}
Do you see any issue here? I'm kinda stuck
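Note (an observation based on how the SDK works, not a confirmed fix): AWS.DynamoDB.DocumentClient marshals native JavaScript types by itself, so pre-wrapped attribute maps like { "S": "LX" } get stored as nested objects, and countryCode then no longer matches the table's string partition key. A sketch of the same mapping with native values, converting by header type instead of wrapping:

const putRequests = batch.map((elem) => {
  const item = {};
  for (const key in elem) {
    const header = headers.find((hdr) => hdr.name === key);
    // Convert instead of wrapping: the DocumentClient adds the
    // { S: ... } / { N: ... } type annotations on its own.
    item[key] = header.type === 'N' ? Number(elem[key]) : elem[key];
  }
  return { PutRequest: { Item: item } };
});

Alternatively, keep the typed attribute maps and send them through the low-level client (new AWS.DynamoDB()) with batchWriteItem, which does expect that format.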

Related

How to run graphql query in a loop until condition is not matched

I am trying to run a GraphQL query to retrieve all products from the Shopify GraphQL API, but we cannot query all the data at once due to the rate limit. How can I query all the data efficiently without running into the rate limit?
Any help would be greatly appreciated.
const myproducts = await axios({
  url: `https://${req.session.shop}/admin/api/2022-01/graphql.json`,
  method: 'post',
  headers: {
    "X-Shopify-Access-Token": req.session.token,
  },
  data: {
    query: `
      query ($numProducts: Int!) {
        products(first: $numProducts) {
          edges {
            cursor
            node {
              id
              title
            }
          }
          pageInfo {
            hasNextPage
          }
        }
      }
    `,
    variables: {
      numProducts: 3
    }
  },
});
If it is for Shopify, you should definitely check their documentation about pagination: https://shopify.dev/api/usage/pagination-graphql
You can use a while loop to iterate until a condition is met, in this case until hasNextPage is false. Here is an example of how I get all the products from a collection:
// Query
const getProdCollection = `query nodes($ids: [ID!]!, $cursor: String) {
  nodes(ids: $ids) {
    ... on Collection {
      id
      products(first: 50, after: $cursor) {
        pageInfo {
          hasNextPage
          endCursor
        }
        edges {
          cursor
          node {
            title
            handle
            id
            descriptionHtml
          }
        }
      }
    }
  }
}`;

app.post("/api/getcollection", async (req, res) => {
  const session = await Shopify.Utils.loadCurrentSession(
    req,
    res,
    app.get("use-online-tokens")
  );
  if (!session) {
    res.status(401).send("Could not find a Shopify session");
    return;
  }
  const client = new Shopify.Clients.Graphql(
    session.shop,
    session.accessToken
  );
  // Set cursor to null for the first iteration
  let cursor = null;
  let allData = [];
  while (true) {
    // Place your looping request here
    let prodCollection = await client.query({
      data: {
        query: getProdCollection,
        variables: {
          ids: [req.body['data']],
          cursor: cursor
        },
      },
    });
    // Update nextPage and cursor
    let nextPage = prodCollection.body.data.nodes[0].products.pageInfo.hasNextPage;
    cursor = prodCollection.body.data.nodes[0].products.pageInfo.endCursor;
    let myProductData = prodCollection.body.data.nodes[0].products.edges;
    allData = allData.concat(myProductData);
    // Finish the loop here
    if (nextPage == false) {
      break;
    }
  }
  // Send back the data
  res.send(allData);
});
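If the loop still trips Shopify's cost-based throttling, a simple mitigation (a sketch, not an official Shopify pattern) is to pause between iterations so the query cost budget can refill:

// Hypothetical helper: spread paginated requests out over time.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// ...inside the while loop, after each request:
await sleep(500);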

How to make a transaction in cosmos DB with NodeJS

** Edited **
I am trying to insert data into Azure Cosmos DB if it does not exist, but I can't find a way to do it in one transaction.
The current way I'm doing it is by first getting the item from Cosmos; if that throws a Not Found exception I insert it, otherwise I update it.
The issue with that is that between the get and the insert someone else can push an object, so this approach is unsafe.
code snippet:
static async fetchOrgConfigurationTableByQuery(orgId: string): Promise<OrgConfigurationTable> {
  const { container } = await getContainer(Constants.CosmosRegional, Constants.OrgConfigurationTableName);
  const orgConfigurations = await container.items.query(`SELECT * from c WHERE c.OrgId = "${orgId}"`).fetchAll();
  const orgConfiguration = orgConfigurations.resources.find(() => true) as OrgConfigurationTable;
  if (!orgConfiguration) {
    return;
  }
  const orgConfigurationWithDefaults = new OrgConfigurationTable();
  for (const [key] of Object.entries(orgConfiguration)) {
    orgConfigurationWithDefaults[key] = orgConfiguration[key];
  }
  return orgConfigurationWithDefaults;
}

static async upsertOrgConfiguration(
  orgConfiguration: OrgConfigurationTable,
  failOnEtagMismatch = false
): Promise<OrgConfigurationTable> {
  const requestOptions = getRequestOptions(orgConfiguration, failOnEtagMismatch);
  const { container } = await getContainer(Constants.CosmosRegional, Constants.OrgConfigurationTableName);
  const returnValue = await container.items.upsert<OrgConfigurationTable>(orgConfiguration, requestOptions);
  return returnValue?.resource;
}
what I have:
let orgConfig: Cosmos.OrgConfigurationTable = null;
try {
  orgConfig = await fetchOrgConfigurationTableByQuery(orgId);
} catch (error) {
  if (!(error instanceof Cosmos.OrgNotFoundException)) {
    log(error);
  }
}
orgConfig = {
  ...orgConfig,
  attribute: value,
  ...(!orgConfig && { Key: { Value: orgId } }),
};
await upsertOrgConfiguration(orgConfig);
What I want, with no get before the upsert:
let orgConfig = {
  attribute: value,
  Key: { Value: orgId },
};
await upsertOrgConfiguration(orgConfig); // if it exists update, otherwise insert, in a single transaction
Or some way to do something like this:
transaction {
  let orgConfig: Cosmos.OrgConfigurationTable = null;
  try {
    orgConfig = await fetchOrgConfigurationTableByQuery(orgId);
  } catch (error) {
    if (!(error instanceof Cosmos.OrgNotFoundException)) {
      log(error);
    }
  }
  await upsertOrgConfiguration(orgConfig);
}
commit
Any ideas?
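One pattern worth noting (a sketch built on the helpers above, not a verified solution): a single upsert is already atomic on its own, and the failOnEtagMismatch option that upsertOrgConfiguration accepts can turn the read-modify-write into optimistic concurrency, retrying when a concurrent writer changed the document in between:

// Sketch: retry loop with etag checking. Assumes the SDK surfaces an
// etag mismatch as HTTP 412 (precondition failed) on error.code.
async function upsertWithRetry(orgId: string, value: unknown): Promise<void> {
  for (let attempt = 0; attempt < 5; attempt++) {
    let orgConfig: Cosmos.OrgConfigurationTable = null;
    try {
      orgConfig = await fetchOrgConfigurationTableByQuery(orgId);
    } catch (error) {
      // not found: fall through and insert
    }
    const updated = {
      ...orgConfig,
      attribute: value,
      ...(!orgConfig && { Key: { Value: orgId } }),
    };
    try {
      await upsertOrgConfiguration(updated, true); // failOnEtagMismatch
      return;
    } catch (error) {
      if (error.code !== 412) throw error; // a concurrent writer won; retry
    }
  }
  throw new Error("upsert retries exhausted");
}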

I need help to get a simple query for MongoDB (SQL equivalent: "select speed from values")

I need help to get a simple query for MongoDB (SQL equivalent: "select speed from values"). But I can't find a solution for that.
Node.js backend for Vue.js
// a part of the init function:
await collection.replaceOne({}, {
  speed: {
    value: 65,
    type: 2,
  },
  eightSpeed: {
    value: 0.0043,
    type: 2,
  },
  oSpeed: {
    value: 0.31,
    type: 2,
  },
  minusSpeed: {
    value: 2.42,
    type: 2,
  },
});
// ...
Now I need a query like this: http://192.168.220.220:3000/settings/getValue/speed to get the speed object.
const express = require("express");
const mongodb = require("mongodb");

const SettingsRouter = new express.Router();

SettingsRouter.get("/getValue/:value", async function (req, res) {
  try {
    const val = req.params.value; // req.body.text;
    const select = {};
    select[`${val}.value`] = 1;
    console.log("settings.js:", "/getValue/:name", select);
    const collection = await loadMongoCollection();
    const value = await collection.findOne({}, select); //.toArray();
    res.status(201).send(value);
  } catch (error) {
    res.status(500).send("Failed to connect Database");
    console.error("settings.js:", "/getValue/:name:", error);
  }
});

async function loadMongoCollection() {
  const dbUri = process.env.MONGODB_CONNSTRING || config.dbUri;
  const MongoDB = process.env.MONGODB_DB || config.db;
  try {
    const client = await mongodb.MongoClient.connect(dbUri, {
      useNewUrlParser: true,
    });
    const conn = client.db(MongoDB).collection("values");
    return conn;
  } catch (error) {
    console.error("settings.js:", "loadMongoCollection:", error);
  }
}
When I try it, I get everything (not only the speed object I expected), or nothing.
What am I doing wrong?
EDIT 2022.01.06:
I tried changing my database:
data = {
  speed: {
    value: 65,
    type: 2,
  },
  // ...
};
// ...
await collection.replaceOne({}, { values: data }, { upsert: true });
And then the query:
const select = { [`values.${val}.value`]: "1" };
const where = {}; // {"values": val};
const value = await collection.find(where, select).toArray();
But it still does not work over REST... is there an issue in the mongodb package?
When I do it in https://cloud.mongodb.com, it works.
But my request returns all values...
The log shows: "'settings.js:', '/getValue/:name', { 'values.speed.value': '1' }"
In the screenshot you show, the projection is values.speed.value, but in your endpoint you have:
const select = {};
select[`${val}.value`] = 1;
which would evaluate to speed.value. To mimic what you have in the MongoDB console, this should be:
const select = {};
select[`values.${val}.value`] = 1;
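Also worth checking (an assumption based on the MongoDB Node.js driver API, not something visible in the snippets): with driver 3.6+/4.x the projection has to be wrapped in the options object; passed bare as the second argument it is silently ignored and every field comes back, which matches the symptom. A sketch of the endpoint with the projection applied:

SettingsRouter.get("/getValue/:value", async function (req, res) {
  try {
    const val = req.params.value;
    // The projection goes under the "projection" key of the options object.
    const projection = { [`values.${val}.value`]: 1 };
    const collection = await loadMongoCollection();
    const value = await collection.findOne({}, { projection });
    res.status(200).send(value);
  } catch (error) {
    res.status(500).send("Failed to connect to database");
    console.error("settings.js:", "/getValue/:value:", error);
  }
});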
So, I changed my frontend (Vue) to spread all values. I made some helper functions:
CheckForChangedObj
checks two objects for changes
updateObj
copies all data from one object into another
export function CheckForChangedObj(targetObject, obj) {
  // a missing or empty target counts as changed
  if (typeof targetObject === "undefined" || targetObject === null || Object.keys(targetObject).length === 0) {
    return true
  }
  if (JSON.stringify(targetObject) !== JSON.stringify(obj)) {
    return true
  }
  return false
}
export function updateObj(targetObject, obj) {
  let keys = Object.keys(obj)
  for (let k in keys) {
    try {
      let key = keys[k]
      if (!targetObject.hasOwnProperty(key)) {
        // define target if it does not exist
        targetObject[key] = {}
      }
      // workaround for refs
      if (obj[key].hasOwnProperty("__v_isRef")) {
        if (targetObject[key].hasOwnProperty("__v_isRef")) {
          targetObject[key].value = obj[key].value
        } else {
          targetObject[key] = obj[key].value
        }
      } else {
        // source is a normal value:
        // check for an object inside, then go in
        if ("object" === typeof obj[key] && !Array.isArray(obj[key])) {
          updateObj(targetObject[key], obj[key]) // recurse
        } else {
          // not else-if! The target may still be a ref (__v_isRef: true)
          if (targetObject[key].hasOwnProperty("__v_isRef")) {
            targetObject[key].value = obj[key]
          } else {
            targetObject[key] = obj[key]
          }
        }
      }
    } catch (error) {
      console.log("key:", keys[k], "Error:", error)
    }
  }
}
Then spreading...
import { updateObj, CheckForChangedObj } from "../tools/tools"

beforeMount() {
  this.getAllValuesAPI()
  setInterval(() => {
    this.checkForChanges()
    // ml("TABLE", this.values)
  }, 10000)
},
setup() {
  let prevValues = {}
  let values = {
    speed: {
      id: 1,
      type: SettingType.Slider,
      value: ref(0)
    }
    // ...
  }
  function checkForChanges() {
    // intervaled function to sync changes with the API
    let changed = CheckForChangedObj(this.values, this.prevValues)
    if (changed) {
      updateObj(this.prevValues, this.values)
      setValuesToAPI()
    } else {
      getAllValuesFromAPI()
    }
  }
  async function getAllValuesFromAPI() {
    // get ALL settings data from the API and put it in values
    const res = await axios.get(`settings/getAll`)
    return updateObj(this.values, res.data[0].values)
  }
}
If you have a better solution, please help...

Data Item not getting inserted in DynamoDB Table from AWS Lambda Trigger

I have this Lambda code set up in JS for inserting a row into DynamoDB, but the code seems to have some issue and doesn't trigger; CloudWatch doesn't capture anything either. API Gateway triggers the Lambda, which inserts a row into DynamoDB. I tried debugging with the console and logs but couldn't figure out the actual issue that is causing the code to break.
exports.handler = async (event) => {
  console.log(event);
  var eventBody = JSON.parse(event.body);
  var eventType = eventBody.event_type;
  var content = eventBody.content;
  if (content.subscription) {
    var subscription = content.subscription;
    switch (eventType) {
      case "subscription_created":
        // Add new entry to License table
        console.log('subscription created event');
        var customer = content.customer;
        var findUserScan = {
          ExpressionAttributeValues: {
            ':email': { S: customer.email },
          },
          FilterExpression: 'email = :email',
          TableName: _dynamodb_userprofile_table
        };
        try {
          await dynamodb.scan(findUserScan, function (err, data) {
            if (err) {
              console.log("Error", err);
            } else {
              var userProfileAWS;
              data.Items.forEach(function (userProfile, index, array) {
                userProfileAWS = AWS.DynamoDB.Converter.unmarshall(userProfile);
              });
              var companyName;
              if (userProfileAWS) {
                companyName = userProfileAWS.organization;
              }
              try {
                var licenseEntry = {
                  "subscriptionID": {
                    S: subscription.id
                  },
                  "companyName": {
                    S: companyName || ""
                  },
                  "customerEmail": {
                    S: customer.email
                  },
                };
                await dynamodb.putItem({
                  "TableName": _dynamodb_license_table,
                  "Item": licenseEntry
                }).promise()
              } catch (error) {
                console.log(error)
                throw new Error(`Error in dynamoDB: ${JSON.stringify(error)}`);
              }
            }
          }).promise();
        } catch (error) {
          throw new Error(`Error in dynamoDB: ${JSON.stringify(error)}`);
        }
        break;
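One thing that stands out in the snippet (an observation; the full source is not shown): dynamodb.scan is given both a callback and .promise(), and the await dynamodb.putItem(...) sits inside a non-async callback, which is a syntax error; that alone would stop the module from loading, so the handler never runs and nothing reaches CloudWatch. A sketch of the same flow using promises only, so every await lives in the async handler:

const data = await dynamodb.scan(findUserScan).promise();
let userProfileAWS;
data.Items.forEach(function (userProfile) {
  userProfileAWS = AWS.DynamoDB.Converter.unmarshall(userProfile);
});
const licenseEntry = {
  subscriptionID: { S: subscription.id },
  companyName: { S: (userProfileAWS && userProfileAWS.organization) || "" },
  customerEmail: { S: customer.email },
};
await dynamodb.putItem({
  TableName: _dynamodb_license_table,
  Item: licenseEntry,
}).promise();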

How do I batch delete with DynamoDB?

I am getting an error that "The provided key element does not match the schema". uuid is my primary partition key, and I also have a primary sort key for version. I figured I can use batchWrite (docs) to delete all items with the same uuid.
My ES6 code is as follows:
delete(uuid) {
  const promise = new Promise();
  const params = {
    RequestItems: {
      [this.TABLE]: [
        {
          DeleteRequest: {
            Key: { uuid: uuid }
          }
        }
      ]
    }
  };
  // this._client references the DocumentClient
  this._client.batchWrite(params, function(err, data) {
    if (err) {
      // this gets hit with error
      console.log(err);
      return promise.reject(err);
    }
    console.log(result);
    return promise.resolve(result);
  });
  return promise;
}
Not sure why it is erroring on the key that is the primary. I have seen posts about needing other indexes for when you search by something that isn't a key, but I don't believe that's the case here.
Here is a batch write delete request sample. This code has been tested and works fine. If you adapt it to your requirements, it should work.
Table definition:
Bag - table name
bag - hash key
No sort key in the 'Bag' table
Batch write code:
var AWS = require("aws-sdk");

AWS.config.update({
  region: "us-west-2",
  endpoint: "http://localhost:8000"
});

var documentclient = new AWS.DynamoDB.DocumentClient();

var itemsArray = [];
var item1 = {
  DeleteRequest: {
    Key: {
      'bag': 'b1'
    }
  }
};
itemsArray.push(item1);

var item2 = {
  DeleteRequest: {
    Key: {
      'bag': 'b2'
    }
  }
};
itemsArray.push(item2);

var params = {
  RequestItems: {
    'Bag': itemsArray
  }
};

documentclient.batchWrite(params, function(err, data) {
  if (err) {
    console.log('Batch delete unsuccessful ...');
    console.log(err, err.stack); // an error occurred
  } else {
    console.log('Batch delete successful ...');
    console.log(data); // successful response
  }
});
Output:
Batch delete successful ...
{ UnprocessedItems: {} }
This is doable with a Node Lambda, but there are a few things you need to consider to address concurrency while processing large databases:
Handle paging while querying all of the matching elements from a secondary index
Split into chunks of 25 requests, as per the BatchWrite/Delete requirements: https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchWriteItem.html
Above 40,000 matches you might need a 1-second delay between cycles: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Limits.html
Here is a snippet that I wrote:
const AWS = require("aws-sdk");
const dynamodb = new AWS.DynamoDB.DocumentClient();
const log = console.log;
exports.handler = async (event) => {
log(event);
let TableName = event.tableName;
let params = {
let TableName,
FilterExpression: "userId = :uid",
ExpressionAttributeValues: {
":uid": event.userId,
},
};
let getItems = async (lastKey, items) => {
if (lastKey) params.ExclusiveStartKey = lastKey;
let resp = await dynamodb.scan(params).promise();
let items = resp.Items.length
? items.concat(resp.Items.map((x) => x.id))
: items;
if (resp.LastEvaluatedKey)
return await getItems(resp.LastEvaluatedKey, items);
else return items;
};
let ids = await getItems(null, []);
let idGroups = [];
for (let i = 0; i < ids.length; i += 25) {
idGroups.push(ids.slice(i, i + 25));
}
for (const gs of idGroups) {
let delReqs = [];
for (let id of gs) {
delReqs.push({ DeleteRequest: { Key: { id } } });
}
let RequestItems = {};
RequestItems[TableName] = delReqs;
let d = await dynamodb
.batchWrite({ RequestItems })
.promise().catch((e) => log(e));
}
log(ids.length + " items processed");
return {};
};
Not sure why nobody provided a proper answer.
Here's a Lambda I wrote in Node.js. It performs a full scan of the table, then batch deletes 25 items per request.
Remember to change TABLE_NAME.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ apiVersion: '2012-08-10' });
//const { TABLE_NAME } = process.env;
const TABLE_NAME = "CHANGE ME PLEASE";

exports.handler = async (event) => {
  let params = {
    TableName: TABLE_NAME,
  };
  let items = [];
  let data = await docClient.scan(params).promise();
  items = [...items, ...data.Items];
  while (typeof data.LastEvaluatedKey != 'undefined') {
    params.ExclusiveStartKey = data.LastEvaluatedKey;
    data = await docClient.scan(params).promise();
    items = [...items, ...data.Items];
  }
  let leftItems = items.length;
  let group = [];
  let groupNumber = 0;
  console.log('Total items to be deleted', leftItems);
  for (const i of items) {
    const deleteReq = {
      DeleteRequest: {
        Key: {
          id: i.id,
        },
      },
    };
    group.push(deleteReq);
    leftItems--;
    if (group.length === 25 || leftItems < 1) {
      groupNumber++;
      console.log(`Batch ${groupNumber} to be deleted.`);
      const params = {
        RequestItems: {
          [TABLE_NAME]: group,
        },
      };
      await docClient.batchWrite(params).promise();
      console.log(
        `Batch ${groupNumber} processed. Left items: ${leftItems}`
      );
      // reset
      group = [];
    }
  }
  const response = {
    statusCode: 200,
    // Uncomment below to enable CORS requests
    // headers: {
    //   "Access-Control-Allow-Origin": "*"
    // },
    body: JSON.stringify('Hello from Lambda!'),
  };
  return response;
};
Be aware that you need to follow the instructions from the docs:
Source: https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchWriteItem.html
DeleteRequest - Perform a DeleteItem operation on the specified item. The item to be deleted is identified by a Key subelement:
Key - A map of primary key attribute values that uniquely identify the item. Each entry in this map consists of an attribute name and an attribute value. For each primary key, you must provide all of the key attributes. For example, with a simple primary key, you only need to provide a value for the partition key. For a composite primary key, you must provide values for both the partition key and the sort key.
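Applied to the original question's table (partition key uuid, sort key version), each DeleteRequest therefore needs both key attributes, which means the versions have to be fetched first. A sketch with the DocumentClient (the query-then-delete split is an assumption; the names mirror the question):

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function deleteAllVersions(tableName, uuid) {
  // Fetch every version stored under this uuid (aliases avoid any
  // clash with DynamoDB reserved words).
  const { Items } = await docClient.query({
    TableName: tableName,
    KeyConditionExpression: '#u = :uuid',
    ExpressionAttributeNames: { '#u': 'uuid', '#v': 'version' },
    ExpressionAttributeValues: { ':uuid': uuid },
    ProjectionExpression: '#u, #v',
  }).promise();
  // BatchWriteItem accepts at most 25 requests per call.
  for (let i = 0; i < Items.length; i += 25) {
    const batch = Items.slice(i, i + 25).map((item) => ({
      DeleteRequest: { Key: { uuid: item.uuid, version: item.version } },
    }));
    await docClient.batchWrite({ RequestItems: { [tableName]: batch } }).promise();
  }
}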
For batch deletes, we can use batchWrite with DeleteRequest. Here is an example: we provide the tableName whose data is to be deleted, and the payload is an array of the ids we need to remove.
Up to 25 items can be deleted in a single request.
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB.DocumentClient({ apiVersion: '2012-08-10' });

const tableName = "PlayerData";
const payload = [{ id: 101 }, { id: 105 }, { id: 106 }];

const deleteBatchData = async (tableName, payload, dynamodb) => {
  try {
    await dynamodb.batchWrite({
      RequestItems: {
        [tableName]: payload.map(item => {
          return {
            DeleteRequest: {
              Key: {
                id: item.id
              }
            }
          };
        })
      }
    })
      .promise()
      .then((response) => {
        return response;
      })
      .catch((err) => {
        console.log("err ::", JSON.stringify(err));
      });
  } catch (err) {
    console.log('Error in deleteBatchData ', err);
  }
};
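Called with the sample values defined above, the helper removes ids 101, 105, and 106 from PlayerData:

deleteBatchData(tableName, payload, dynamodb)
  .then(() => console.log("Batch delete complete"));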
Why not use PartiQL? This approach is much more readable. (This too has a limit of 25 items per request, just like BatchWriteItem.)
// Import required AWS SDK clients and commands for Node.js.
import { BatchExecuteStatementCommand } from "@aws-sdk/client-dynamodb";
import { ddbDocClient } from "../libs/ddbDocClient.js";

const tableName = process.argv[2];
const movieYear1 = process.argv[3];
const movieTitle1 = process.argv[4];
const movieYear2 = process.argv[5];
const movieTitle2 = process.argv[6];

export const run = async (
  tableName,
  movieYear1,
  movieTitle1,
  movieYear2,
  movieTitle2
) => {
  try {
    const params = {
      Statements: [
        {
          Statement: "DELETE FROM " + tableName + " where year=? and title=?",
          Parameters: [{ N: movieYear1 }, { S: movieTitle1 }],
        },
        {
          Statement: "DELETE FROM " + tableName + " where year=? and title=?",
          Parameters: [{ N: movieYear2 }, { S: movieTitle2 }],
        },
      ],
    };
    const data = await ddbDocClient.send(
      new BatchExecuteStatementCommand(params)
    );
    console.log("Success. Items deleted.", data);
    return "Run successfully"; // For unit tests.
  } catch (err) {
    console.error(err);
  }
};
run(tableName, movieYear1, movieTitle1, movieYear2, movieTitle2);
