"Missing required key 'Key' in params" in Get operation of Dynamo dB - node.js

I am writing a Lambda function in Node.js to get items from DynamoDB. The table is Employee_Test, where emp_Id is the partition key. Below is the code snippet I am writing:
var table = "Employee_Test";
var emp_Id=event.emp_Id;
var emp_Name=event.emp_Name;
var params = {
TableName: table,
KeyConditionExpression: "#eId = :Id",
ExpressionAttributeNames:{
"#eId": "emp_Id"
},
ExpressionAttributeValues: {
":Id":emp_Id
}}
The error I am getting is:
"message": "Missing required key 'Key' in params",
"code": "MissingRequiredParameter",
I know the resolution of the error is to add:
Key: {
    "emp_Id": emp_Id
}
to the code. But if I have to query the employees who have joined after a particular date, then I cannot provide emp_Id as a parameter.
In the AWS release notes I found that we can disable parameter validation (https://aws.amazon.com/releasenotes/6967335344676381). I tried this, but it is not working either.
Can somebody please help?
Thanks
Shweta

I was hit with the same error when querying a secondary index. It turns out I was using the wrong API; I had confused getItem and Query.

I ran into this when I first started with DynamoDB. Such an annoying error. It turns out I had accidentally used the .get method, from a previous working getById example, instead of the .query method.
In short, you may just need to change this ...
const response = await db.get(query).promise();
... to this ...
const response = await db.query(query).promise();
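For reference, here is a minimal sketch of the difference between the two calls (AWS SDK v2 DocumentClient; the table and attribute names are taken from the question above, and the helper names getById/queryById are made up for illustration):

const AWS = require('aws-sdk');
const db = new AWS.DynamoDB.DocumentClient();

// get fetches exactly one item and requires the full primary key under "Key".
async function getById(empId) {
    return db.get({
        TableName: 'Employee_Test',
        Key: { emp_Id: empId }
    }).promise();
}

// query takes a KeyConditionExpression, so params like the ones in the question work here.
async function queryById(empId) {
    return db.query({
        TableName: 'Employee_Test',
        KeyConditionExpression: '#eId = :Id',
        ExpressionAttributeNames: { '#eId': 'emp_Id' },
        ExpressionAttributeValues: { ':Id': empId }
    }).promise();
}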

Add a global secondary index to your table to enable lookups by start date. First, change your item creation code (PutItem) to add an attribute representing the year and month an employee joined, such as joinYearMonth=201612. Second, scan your table to find items that do not already have this attribute and add it. Third, create a global secondary index with a partition key of joinYearMonth and a sort key of joinTimestamp. This way, you can issue query requests against the GSI for the years and months you need and find the employees who joined in that period.
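A rough sketch of what a query against such a GSI could look like (DocumentClient; the index name joinYearMonth-joinTimestamp-index, the table name, and the helper name are assumptions for illustration):

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

// Returns employees who joined after a given timestamp, within one joinYearMonth partition.
async function employeesJoinedAfter(yearMonth, timestamp) {
    return docClient.query({
        TableName: 'Employee_Test',
        IndexName: 'joinYearMonth-joinTimestamp-index', // hypothetical GSI name
        KeyConditionExpression: 'joinYearMonth = :ym AND joinTimestamp > :ts',
        ExpressionAttributeValues: {
            ':ym': yearMonth,   // e.g. 201612
            ':ts': timestamp    // e.g. epoch millis of the cut-off date
        }
    }).promise();
}

Note that a single query only covers one joinYearMonth partition, so a date range spanning several months needs one query per month.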

Related

Proper Sequelize flow to avoid duplicate rows?

I am using Sequelize in my Node.js server. I am ending up with validation errors because my code tries to write the record twice instead of creating it once and then updating it, since it is already in the DB (PostgreSQL).
This is the flow I use when the request runs:
const latitude = req.body.latitude;
var metrics = await models.user_car_metrics.findOne({ where: { user_id: userId, car_id: carId } });
if (metrics) {
    metrics.latitude = latitude;
    .....
} else {
    metrics = models.user_car_metrics.build({
        user_id: userId,
        car_id: carId,
        latitude: latitude
        ....
    });
}
var savedMetrics = await metrics.save();
return res.status(201).json(savedMetrics);
At times, if the client calls the endpoint twice or more in quick succession, the code above tries to save two new rows in user_car_metrics with the same user_id and car_id (both foreign keys to the user and car tables).
I have a constraint:
ALTER TABLE user_car_metrics DROP CONSTRAINT IF EXISTS user_id_car_id_unique, ADD CONSTRAINT user_id_car_id_unique UNIQUE (car_id, user_id);
Point is, there can only be one entry for a given user_id and car_id pair.
Because of that, I started seeing validation issues. After looking into it and adding logs, I realized the code above adds duplicates to the table (without the constraint). If the constraint is there, I get validation errors when the code tries to insert the duplicate record.
The question is: how do I avoid this problem? How do I structure the code so that it won't try to create duplicate records? Is there a way to serialize this?
If you have a unique constraint, you can use upsert to either insert or update the record, depending on whether a record already exists with the same primary key value or with column values covered by the unique constraint.
await models.user_car_metrics.upsert({
    user_id: userId,
    car_id: carId,
    latitude: latitude
    ....
})
See the upsert documentation. For PostgreSQL it is implemented with ON CONFLICT DO UPDATE. If the update data contains the primary key field, the primary key is selected as the default conflict key; otherwise, the first unique constraint/index that can satisfy the conflict key requirements is selected.

Azure CosmosDB/Nodejs - Entity with the specified id does not exist in the system

I am trying to delete and update records in CosmosDB using my GraphQL/Node.js code and getting the error "Entity with the specified id does not exist in the system". Here is my code:
deleteRecord: async (root, id) => {
    const { resource: result } = await container.item(id.id, key).delete();
    console.log(`Deleted item with id: ${id}`);
},
Somehow the code below is not able to find the record; even container.item(id.id, key).read() doesn't work.
await container.item(id.id, key)
But if I try to find the record using a query spec, it works:
await container.items.query('SELECT * from c where c.id = "'+id+'"' ).fetchNext()
FYI: I am able to fetch all records and create new items, so the connection to the DB and reading/writing are not an issue.
What else can it be? Any pointer related to this will be helpful.
Thanks in advance.
It seems you are passing the wrong key to item(id, key). According to the note in this documentation:
In both the "update" and "delete" methods, the item has to be selected
from the database by calling container.item(). The two parameters
passed in are the id of the item and the item's partition key. In this
case, the partition key is the value of the "category" field.
So you need to pass the value of your partition key, not your partition key path.
For example, if you have a document like the one below and your partition key is '/category', you need to use await container.item("xxxxxx", "movie").
{
    "id": "xxxxxx",
    "category": "movie"
}
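Putting that together, here is a small sketch of the delete call (the database and container names are placeholders; the key point is that the second argument to item() is the item's partition key value, not the key path):

const { CosmosClient } = require('@azure/cosmos');

const client = new CosmosClient(process.env.COSMOS_CONNECTION_STRING);
const container = client.database('mydb').container('mycontainer'); // placeholder names

async function deleteMovie(id) {
    // id identifies the item; 'movie' is the partition key VALUE of that item,
    // not the partition key path '/category'.
    await container.item(id, 'movie').delete();
}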

DynamoDB putItem written twice

I am new to AWS and I feel like I am missing something important.
I am using this code from a Lambda function in Node.js to create an entry in a DynamoDB table:
function recordUser(item) {
    return ddb.putItem({
        TableName: 'Users',
        Item: item,
        Expected: {
            username: { Exists: false }
        }
    }).promise();
}
username is the primary key of my table.
I thought the condition would prevent duplicates from appearing, but I still see some duplicated entries with the same username. What am I missing?
You are giving "Expected" a wrong interpretation... You seemed to hope that it checks whether there is any existing item in the database with the given value for the "username" attribute. But this is not what Expected does... It does something very different: It reads one specific item - the item with the same key as the one you specified in "Item", and then check whether for this specific item, a value (any value!) exists for its "username" attribute.
To suggest how to fix your use case, we would need to know more about your data. The easiest solution is, of course, to have a table whose sole key is "username", which will allow just one item per username. But I don't know if this is good enough for your use case.
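If "username" really is the table's partition key, the usual way to make the put fail when the item already exists is a conditional write. A minimal sketch, assuming ddb is the low-level AWS.DynamoDB client (as the Expected syntax in the question suggests); ConditionExpression is the modern replacement for the legacy Expected parameter:

const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB();

function recordUser(item) {
    return ddb.putItem({
        TableName: 'Users',
        Item: item,
        // Fails with ConditionalCheckFailedException if an item with this key
        // already exists (i.e. it already has a username attribute).
        ConditionExpression: 'attribute_not_exists(username)'
    }).promise();
}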

Query condition missed key schema element: Validation Error

I am trying to query DynamoDB using the following code:
const AWS = require('aws-sdk');

let dynamo = new AWS.DynamoDB.DocumentClient({
    service: new AWS.DynamoDB({
        apiVersion: "2012-08-10",
        region: "us-east-1"
    }),
    convertEmptyValues: true
});

dynamo.query({
    TableName: "Jobs",
    KeyConditionExpression: 'sstatus = :st',
    ExpressionAttributeValues: {
        ':st': 'processing'
    }
}, (err, resp) => {
    console.log(err, resp);
});
When I run this, I get an error saying:
ValidationException: Query condition missed key schema element: id
I do not understand this. I have defined id as the partition key for the jobs table and need to find all the jobs that are in processing status.
You're trying to run a query using a condition that does not include the primary key. This is how queries work in DynamoDB. You would need to do a scan to get that information in your case; however, I don't think that is the best option.
I think you want to set up a global secondary index and use that to query for the processing status.
In another answer, @smcstewart responded to this question, but he provides a link instead of explaining why this error occurs. I want to add a brief comment hoping it will save you some time.
The AWS docs on Querying a Table state that you can do WHERE-condition queries (e.g. the SQL query SELECT * FROM Music WHERE Artist='No One You Know') in the DynamoDB way, but with one important caveat:
You MUST specify an EQUALITY condition for the PARTITION key, and you can optionally provide another condition for the SORT key.
Meaning you can only use key attributes with Query. Doing it any other way would mean DynamoDB runs a full scan for you, which is NOT efficient - less efficient than using global secondary indexes.
So if you need to query on non-key attributes, using Query is usually NOT an option - the best option is to use global secondary indexes, as suggested by @smcstewart.
I found this guide to be useful to create a Global secondary index manually.
If you need to add it using CloudFormation here is a relevant page.
I was getting this error in a different scenario. Here it is (it's very unlikely that anyone else ends up with this case, but just in case).
I had a query working on a Table (say table A). Table A had a partition key m_id and sort key u_id.
I had a query to fetch data using m_id. The query was working.
var queryParams = {
    ExpressionAttributeValues: {
        ':m_id': mId
    },
    KeyConditionExpression: 'm_id = :m_id',
    TableName: "A"
};
let connections = await docClient.query(queryParams).promise();
I created another table, say Table B. I made some mistakes naming the keys, so I simply deleted it and created a table with the same name again, Table B. Table B had partition key m_id and sort key s_id.
I copy-pasted the same query I was using for Table A, changing only the table name, since the partition key had the same name.
To my shock, I got this exception:
"ValidationException: Query condition missed key schema element"
I rechecked all the names and compared the query with the working query. Everything was fine.
I thought that maybe, because I was deleting and recreating Table B, it could have something to do with that. So I created a fresh table with a new name, Table B2, with the same key names as Table B.
In the query that was throwing the exception, I changed only the table name from B to B2.
And the exception was gone.
If you are getting this on a fresh table, where no query has worked earlier, creating a new Table with a new name is an option.
If you delete a table only to change partition key names, it may be safer to use a new table name as well (DynamoDB could be referencing metadata by table name rather than by internal identifiers; it is possible that old metadata persists even after you delete a table. Just a guess, given that I faced this case).
EDIT 2022-July-12:
This error does not leave me alone. My own answer was helpful, but here is one more case: there was a trailing space in the name of a key in the table. And DynamoDB does not even warn about spaces in key names.
You have to create a global secondary index for the status field.
Then your code could look something like this:
dynamo.query({
    TableName: "Jobs",
    IndexName: 'status',
    KeyConditionExpression: '#s = :st',
    ExpressionAttributeValues: {
        ':st': 'processing'
    },
    ExpressionAttributeNames: {
        '#s': 'status',
    },
}, (err, resp) => {
    console.log(err, resp);
});
Note: the scan operation is indeed very costly, especially if your table is huge.
I solved the problem using AWS.DynamoDB.DocumentClient() with scan; for example (Node.js):
var docClient = new AWS.DynamoDB.DocumentClient();

var params = {
    TableName: "product",
    FilterExpression: "#cg = :data",
    ExpressionAttributeNames: {
        "#cg": "categoria",
    },
    ExpressionAttributeValues: {
        ":data": category,
    }
};

docClient.scan(params, onScan);

function onScan(err, data) {
    if (err) {
        // for the server log
        console.error("Unable to scan the table. Error JSON:", JSON.stringify(err, null, 2));
        res.json(err);
    } else {
        console.log("Scan succeeded.");
        res.json(data);
    }
}

Query Dynamo from Lambda using Node - missed key

I am using Lambda (Node.js 4.3) to query my DynamoDB table with the following:
var params = {
    TableName: "shoes",
    KeyConditionExpression: "gender = :gender AND support = :support AND terrain = :terrain",
    ExpressionAttributeValues: {
        ":gender": inputGender,
        ":support": inputSupport,
        ":terrain": inputTerrain
    }
}
When this runs I get an error saying "Query condition missed key schema element: Id". It may just be a fundamental misunderstanding on my part, but if I want to query on two or more of the fields in DynamoDB, do I need to make them keys or create indexes on all of them?
Thanks in advance.
A query must include the partition key of the table (or of a global secondary index). Your table's partition key is id and you didn't include that in the query. Given the query you are trying to run, I don't think it makes sense to create a GSI on your table. You will need to perform a full table scan instead of a query operation, as sketched below.
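A rough sketch of what that scan could look like (DocumentClient; the attribute and variable names are taken from the question, and ExpressionAttributeNames is used for all three attributes just to stay clear of reserved words):

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

// Scans the whole table; the FilterExpression is applied after the items are read,
// so the read cost is still that of the full table.
function findShoes(inputGender, inputSupport, inputTerrain) {
    return docClient.scan({
        TableName: 'shoes',
        FilterExpression: '#g = :gender AND #s = :support AND #t = :terrain',
        ExpressionAttributeNames: {
            '#g': 'gender',
            '#s': 'support',
            '#t': 'terrain'
        },
        ExpressionAttributeValues: {
            ':gender': inputGender,
            ':support': inputSupport,
            ':terrain': inputTerrain
        }
    }).promise();
}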
