I'm trying to update an item in DynamoDB by supplying a condition instead of the key in the parameters, so that the item is updated as soon as my condition matches. Is it possible to do this?
Here is an example of an item:
{
    "id": "bcc2f32e-305e-4469-88e2-463724b5c6a9",
    "name": "toto",
    "email": "toto@titi.com"
}
Where email is unique for items.
I tested this code and it works:
const name = "updateName";

const params = {
    TableName: MY_TABLE,
    Key: {
        id
    },
    UpdateExpression: 'set #name = :name',
    ExpressionAttributeNames: { '#name': 'name' },
    ExpressionAttributeValues: { ':name': name },
    ReturnValues: "ALL_NEW"
}

dynamoDb.update(params, (error, result) => {
    if (error) {
        return res.status(400).json({ error: 'Could not update Item' });
    }
    res.json(result.Attributes);
})
But I want to do something like this (replacing the Key with a ConditionExpression):
const params = {
    TableName: MY_TABLE,
    UpdateExpression: 'set #name = :name',
    ConditionExpression: '#email = :email',
    ExpressionAttributeNames: {
        '#name': 'name',
        '#email': 'email'
    },
    ExpressionAttributeValues: {
        ':name': name,
        ':email': email
    },
    ReturnValues: "ALL_NEW"
}

dynamoDb.update(params, (error, result) => {
    if (error) {
        return res.status(400).json({ error: 'Could not update User' });
    }
    res.json(result.Attributes);
})
But this code doesn't work.
Any ideas?
You cannot update an item in DynamoDB without using the entire primary key (partition key, and sort key if present). This is because you must specify exactly one record for the update. See the documentation here.
If you want to find an item using a field that is not the primary key, then you can search using a scan (potentially slow and expensive) or by using a Global Secondary Index (GSI) on that field. Either of these methods requires that you do a separate request to find the item in question, and then use its primary key to perform the update.
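For illustration, here is a sketch of that two-step approach using the DocumentClient, assuming a hypothetical GSI named email-index on the email attribute (the index name, function name, and the injected docClient parameter are mine, not from the question):

```javascript
// Sketch of the two-step lookup-then-update. `docClient` is an
// AWS.DynamoDB.DocumentClient instance; 'email-index' is an assumed
// GSI on the `email` attribute.
function updateNameByEmail(docClient, tableName, email, name, callback) {
  const queryParams = {
    TableName: tableName,
    IndexName: 'email-index', // assumed GSI on `email`
    KeyConditionExpression: '#email = :email',
    ExpressionAttributeNames: { '#email': 'email' },
    ExpressionAttributeValues: { ':email': email }
  };
  // Step 1: find the item's primary key via the GSI.
  docClient.query(queryParams, (err, data) => {
    if (err) return callback(err);
    if (!data.Items || data.Items.length === 0) {
      return callback(new Error('No item found for that email'));
    }
    // Step 2: update using the full primary key we just found.
    const updateParams = {
      TableName: tableName,
      Key: { id: data.Items[0].id },
      UpdateExpression: 'set #name = :name',
      ExpressionAttributeNames: { '#name': 'name' },
      ExpressionAttributeValues: { ':name': name },
      ReturnValues: 'ALL_NEW'
    };
    dynamoDbUpdate(docClient, updateParams, callback);
  });
}

// Small indirection kept separate so the two calls are easy to see.
function dynamoDbUpdate(docClient, params, callback) {
  docClient.update(params, callback);
}
```

The query only yields the item's id; the update then uses that full primary key, which is what DynamoDB requires.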
It sounds like you want to do an update that waits for a condition. That's not how DynamoDB works; it cannot wait for anything (except consistency, I suppose, but that's somewhat different). What you can do is make a request with a condition and, if it fails the condition (returning immediately), make the request again later. If you do this you'll need to be careful to back off appropriately, or you might end up making a lot of requests very quickly.
The key is a required parameter when doing updates; the condition expression can be used in addition to providing the key, but can't be used instead of the key.
Also, I am not sure you fully understand what the ConditionExpression is for. It's not like the WHERE clause in an SQL update statement (i.e. update mytable set name='test' where email='myemail.com').
Instead, logically the ConditionExpression in an update is more like:
update mytable set name='test' where key='12345' but only if quantity > 0
i.e. you are telling DynamoDB the exact key of the record you want updated, and once it finds that record it uses the condition expression to determine whether the update should proceed: find the record with id=12345 and change the name to 'test', but only if the quantity is greater than 0.
It does not use the ConditionExpression to find records to update.
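To make the distinction concrete, here is a sketch of a params builder for such a guarded update, reusing the question's table layout (the function name is mine):

```javascript
// Build UpdateItem params: update by primary key, guarded by a condition.
// The update only proceeds if the stored email matches; otherwise DynamoDB
// rejects the call with a ConditionalCheckFailedException.
function buildConditionalUpdateParams(id, name, email) {
  return {
    TableName: 'MY_TABLE',
    Key: { id },                            // the key is still required
    UpdateExpression: 'set #name = :name',
    ConditionExpression: '#email = :email', // checked once the item is found
    ExpressionAttributeNames: { '#name': 'name', '#email': 'email' },
    ExpressionAttributeValues: { ':name': name, ':email': email },
    ReturnValues: 'ALL_NEW'
  };
}
```

Passing this to docClient.update() updates the item with that id only if its stored email matches; on a mismatch the error's code is 'ConditionalCheckFailedException'.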
Related
I have a Lambda function which queries the DynamoDB table userDetailTable, and I want to filter only the entries whose timestamp (recorded in ms) exceeds 1 day (86400000 ms) when subtracted from new Date().getTime(). Can anyone suggest the right way of doing this?
The table has a GSI, user_status, which has the value 'active' for all entries, and an epoch_timestamp attribute (timestamp in ms) used in the filter expression.
In the Lambda I am checking epoch_timestamp and trying to subtract it from new Date().getTime() inside the query itself, which I am not sure is even possible. Below is the code with my query.
function getUserDetails(callback) {
    var params = {
        TableName: 'userDetailTable',
        IndexName: 'user_status-index',
        KeyConditionExpression: 'user_status = :user_status',
        FilterExpression: `expiration_time - ${new Date().getTime()} > :time_difference`,
        ExpressionAttributeValues: {
            ':user_status': 'active',
            ':time_difference': '86400000' // 1 day in ms
        }
    };
    docClient.query(params, function(err, data) {
        if (err) {
            callback(err, null)
        } else {
            callback(null, data)
        }
    })
}
Here's a rewrite of your code:
function getUserDetails(callback) {
    var params = {
        TableName: 'userDetailTable',
        IndexName: 'user_status-index',
        KeyConditionExpression: 'user_status = :user_status',
        FilterExpression: 'epoch_timestamp > :time_threshold_ms',
        ExpressionAttributeValues: {
            ':user_status': 'active',
            ':time_threshold_ms': Date.now() - 86400000
        }
    };
    docClient.query(params, function(err, data) {
        if (err) {
            callback(err, null)
        } else {
            callback(null, data)
        }
    })
}
Specifically, you cannot compute any date inside the FilterExpression. Instead, you compare the item's epoch_timestamp attribute with :time_threshold_ms, which you compute once (for all items inspected by the query) in ExpressionAttributeValues.
Please note, though, that you can make this more efficient if you define a GSI which uses epoch_timestamp as its sort key (user_status can remain the partition key). Then, instead of placing the condition in the FilterExpression, you would move it into the KeyConditionExpression.
Also, when you use a FilterExpression you need to check the LastEvaluatedKey of the response. If it is not empty, you need to issue a follow-up query with LastEvaluatedKey copied into the request's ExclusiveStartKey. Why? Due to filtering, it is possible that you will get no results from the "chunk" (or "page") examined by DDB. DDB only examines a single "chunk" per query invocation. Issuing a follow-up query with ExclusiveStartKey tells DDB to inspect the next "chunk".
(see https://dzone.com/articles/query-dynamodb-items-withnodejs for further details on that)
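A sketch of that follow-up-query loop, with the client passed in so the shape is easy to see (the helper name is mine; params are the query params from above):

```javascript
// Keep re-issuing the query, carrying LastEvaluatedKey forward as
// ExclusiveStartKey, until DynamoDB stops returning one. Accumulates
// all filtered items across "chunks".
function queryAll(docClient, params, callback, acc = []) {
  docClient.query(params, (err, data) => {
    if (err) return callback(err);
    acc = acc.concat(data.Items);
    if (data.LastEvaluatedKey) {
      // More chunks remain: continue from where DynamoDB stopped.
      const next = Object.assign({}, params, {
        ExclusiveStartKey: data.LastEvaluatedKey
      });
      return queryAll(docClient, next, callback, acc);
    }
    callback(null, acc); // no LastEvaluatedKey: the scan of chunks is done
  });
}
```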
Alternatively, if you do not use filtering, you are advised to pass a Limit value in the request to tell DDB to stop after the desired number of items. However, if you do use filtering, do not pass a Limit value, as it will reduce the size of the "chunk" and you will need many more follow-up queries before you get your data.
You cannot perform a calculation in the filter expression but you can calculate it outside and use the result with a new inequality.
I think you are looking for items expiring after one day from now.
Something like
FilterExpression: 'expiration_time > :max_time',
ExpressionAttributeValues: {
    ':user_status': 'active',
    ':max_time': new Date().getTime() + 86400000 // now + 1 day in ms
}
I'm using DynamoDB to handle PATCH requests where 1 or more properties may be provided. I'd like those properties to be updated if they exist in the request, and otherwise ignored in the update. I'm calling DocumentClient.update(params), where params is:
TableName: '...',
Key: {...},
UpdateExpression: `set
    Cost = :Cost,
    Sales = :Sales,
    ...
ExpressionAttributeValues: {
    ':Cost': get(requestBody, 'form.cost', undefined),
    ':Sales': get(requestBody, 'form.sales', undefined),
    ...
}
Or is achieving this only possible by manipulating the expression strings?
I feel like I'm using DynamoDB wrong for this multi-field PATCH, especially since the solution seems overly complex.
Leaving this here in case anyone else finds it helpful, or better, has a tidier solution:
let fieldsToUpdate = [
    // these are just string constants from another file
    [dynamoDbFields.cost, apiFields.cost],
    [dynamoDbFields.annualSales, apiFields.annualSales],
    ... ]
    // get the new DynamoDB value from the request body (it may not exist)
    .map(([dynamoDbField, apiField]) => [dynamoDbField, _.get(requestBody, apiField)])
    // filter out any keys that are undefined on the request body
    .filter(([dynamoDbField, value]) => value !== undefined)

// create a mapping of the value identifier (a positional index in this case) to the DynamoDB value, e.g. {':0': '123'}
let expressionAttributeValues = fieldsToUpdate.reduce((acc, [dynamoDbField, value], index) =>
    _.assignIn(acc, {[`:${index}`]: value}), {})

// and create the reciprocal mapping of the DynamoDB field name to the identifier, e.g. {ID: ':0'}
let updateExpression = fieldsToUpdate.reduce((acc, [dynamoDbField], index) =>
    _.assignIn(acc, {[dynamoDbField]: `:${index}`}), {})

const params = {
    TableName: TABLE_NAME,
    Key: {[dynamoDbFields.id]: _.get(requestBody, apiFields.id)},
    UpdateExpression: `set ${Object.entries(updateExpression).map((v) => v.join('=')).join(',')}`,
    ExpressionAttributeValues: expressionAttributeValues,
    ConditionExpression: `attribute_exists(${dynamoDbFields.id})`,
    ReturnValues: 'ALL_NEW'
}
I'm trying to retrieve all items from a DynamoDB table that match a FilterExpression, and although all of the items are scanned and half do match, the expected items aren't returned.
I have the following in an AWS Lambda function running on Node.js 6.10:
var AWS = require("aws-sdk"),
    documentClient = new AWS.DynamoDB.DocumentClient();

function fetchQuotes(category) {
    let params = {
        "TableName": "quotient-quotes",
        "FilterExpression": "category = :cat",
        "ExpressionAttributeValues": {":cat": {"S": category}}
    };
    console.log(`params=${JSON.stringify(params)}`);
    documentClient.scan(params, function(err, data) {
        if (err) {
            console.error(JSON.stringify(err));
        } else {
            console.log(JSON.stringify(data));
        }
    });
}
There are 10 items in the table, one of which is:
{
    "category": "ChuckNorris",
    "quote": "Chuck Norris does not sleep. He waits.",
    "uuid": "844a0af7-71e9-41b0-9ca7-d090bb71fdb8"
}
When testing with category "ChuckNorris", the log shows:
params={"TableName":"quotient-quotes","FilterExpression":"category = :cat","ExpressionAttributeValues":{":cat":{"S":"ChuckNorris"}}}
{"Items":[],"Count":0,"ScannedCount":10}
The scan call returns all 10 items when I only specify TableName:
params={"TableName":"quotient-quotes"}
{"Items":[<snip>,{"category":"ChuckNorris","uuid":"844a0af7-71e9-41b0-9ca7-d090bb71fdb8","CamelCase":"thevalue","quote":"Chuck Norris does not sleep. He waits."},<snip>],"Count":10,"ScannedCount":10}
You do not need to specify the type ("S") in your ExpressionAttributeValues because you are using the DynamoDB DocumentClient. Per the documentation:
The document client simplifies working with items in Amazon DynamoDB by abstracting away the notion of attribute values. This abstraction annotates native JavaScript types supplied as input parameters, as well as converts annotated response data to native JavaScript types.
It's only when you're using the raw DynamoDB object via new AWS.DynamoDB() that you need to specify the attribute types (i.e., the simple objects keyed on "S", "N", and so on).
With DocumentClient, you should be able to use params like this:
const params = {
    TableName: 'quotient-quotes',
    FilterExpression: '#cat = :cat',
    ExpressionAttributeNames: {
        '#cat': 'category',
    },
    ExpressionAttributeValues: {
        ':cat': category,
    },
};
Note that I also moved the field name into an ExpressionAttributeNames value, just for consistency and safety. It's a good practice because certain field names are reserved words in DynamoDB and will break your requests if used directly.
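To make the difference concrete, here is the same filter value in both styles (a small illustration of the point above, not code from the question):

```javascript
const category = 'ChuckNorris';

// Low-level AWS.DynamoDB style: every value wrapped in a type descriptor.
const lowLevelValues = { ':cat': { S: category } };

// DocumentClient style: plain native values; the client adds the types.
const documentClientValues = { ':cat': category };
```

Passing the low-level shape to the DocumentClient makes it treat `{ "S": "ChuckNorris" }` as a Map value, which matches nothing — hence the empty result with ScannedCount 10.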
I was looking for a solution that combined KeyConditionExpression with FilterExpression, and eventually I worked this out.
Here aws is the uuid, Id is an assigned unique number prefixed with the text 'form' (so I can tell I have form data), and optinSite is there so I can find enquiries from a particular site. Other data is stored too; this is all I need to get the packet.
Maybe this can be of help to you:
let optinSite = 'https://theDomainIWantedTFilterFor.com/';
let aws = 'eu-west-4:EXAMPLE-aaa1-4bd8-9ean-1768882l1f90';

let item = {
    TableName: 'Table',
    KeyConditionExpression: "aws = :Aw and begins_with(Id, :form)",
    FilterExpression: "optinSite = :Os",
    ExpressionAttributeValues: {
        ":Aw": { S: aws },
        ":form": { S: 'form' },
        ":Os": { S: optinSite }
    }
};
I understand that I can create "Lists" only from primitive data types, so treat my (Node.js, using the AWS DocumentClient) code as pseudocode. My objective is to attach a JSON array to an item so that I can later retrieve/update/delete the device (and corresponding data) from the customer's record. I understand I may be able to use Maps to do this, but I'm a beginner and the documentation on how to do that with the DocumentClient is unclear to me.
This is what I am trying to do:
var deviceData = {
    'deviceID': deviceID,
    'attributes': [
        {'firmwareVersion': firmwareVersion},
        {'productID': productID},
        {'knickName': 'New Device'},
        {'dateAdded': (new Date()).getTime()}
    ]
};

var newCustomerData = {
    TableName: process.env.customerMasterFile,
    Key: {
        'email': email
    },
    ReturnValues: 'UPDATED_NEW',
    UpdateExpression: 'ADD #device :device SET #customerEmailDomain = :customerEmailDomain, #friendlyName = :friendlyName, #created = :created, #updated = :updated',
    ExpressionAttributeNames: {
        '#device': 'deviceList',
        '#customerEmailDomain': 'customerEmaiDomain',
        '#friendlyName': 'friendlyName',
        '#created': 'createAccountTime',
        '#updated': 'updateAccountTime',
    },
    ExpressionAttributeValues: {
        ':device': docClient.createSet([deviceData]), // I know this is incorrect...
        ':customerEmailDomain': customerEmailDomain,
        ':friendlyName': friendlyName,
        ':created': (new Date()).getTime(),
        ':updated': (new Date()).getTime()
    }
};

docClient.update(newCustomerData, function(err, data) {
    if (err) console.log(err);
    else console.log(data);
});
Normally, JSON data is persisted as a Map on DynamoDB. If you store a JSON array, it will be stored as a "List of Map" data type, which makes it difficult to update, delete, or retrieve an element (i.e. a device) without knowing its index in the List. Using "List of Map" is not recommended if you need to update/delete without knowing the index of the list (i.e. the index of the array).
1) Changed to SET for all attributes, including device
To store the single JSON object as a Map, which allows update/delete without knowing the index of an array:
var params = {
    TableName: process.env.customerMasterFile,
    Key: {
        'email': email
    },
    ReturnValues: 'UPDATED_NEW',
    UpdateExpression: 'SET #device = :device, #customerEmailDomain = :customerEmailDomain, #friendlyName = :friendlyName, #created = :created, #updated = :updated',
    ExpressionAttributeNames: {
        '#device': 'deviceList',
        '#customerEmailDomain': 'customerEmaiDomain',
        '#friendlyName': 'friendlyName',
        '#created': 'createAccountTime',
        '#updated': 'updateAccountTime',
    },
    ExpressionAttributeValues: {
        ':device': deviceData,
        ':customerEmailDomain': customerEmailDomain,
        ':friendlyName': friendlyName,
        ':created': (new Date()).getTime(),
        ':updated': (new Date()).getTime()
    }
};
Sample device as Map:
Alternate approach:
Add device id as a sort key of the table
The attributes email and device id form the unique combination for an item on DynamoDB
You can accomplish the update/delete easily with this data model
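Under that alternate model, an update to a single device becomes an ordinary key-based update. A sketch, assuming a hypothetical table with email as the partition key and deviceId as the sort key (all names here are illustrative):

```javascript
// Build UpdateItem params for one device under the composite-key model:
// each device is its own item, addressed directly by (email, deviceId).
function buildDeviceUpdateParams(email, deviceId, firmwareVersion) {
  return {
    TableName: 'customerDevices',   // hypothetical table
    Key: { email, deviceId },       // full composite primary key
    UpdateExpression: 'set #fw = :fw',
    ExpressionAttributeNames: { '#fw': 'firmwareVersion' },
    ExpressionAttributeValues: { ':fw': firmwareVersion },
    ReturnValues: 'ALL_NEW'
  };
}
```

Deleting a device is just as direct: docClient.delete with the same Key, with no index arithmetic on a List.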
I'm trying to get the first 10 items which satisfy a condition from DynamoDB, using an AWS Lambda function. I was trying to use the Limit parameter, but according to the documentation
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB.html#scan-property
it is the "maximum number of items to evaluate (not necessarily the number of matching items)".
How do I get the first 10 items which satisfy my condition?
var AWS = require('aws-sdk');
var db = new AWS.DynamoDB();

exports.handler = function(event, context) {
    var params = {
        TableName: "Events", //"StreamsLambdaTable",
        ProjectionExpression: "ID, description, endDate, imagePath, locationLat, locationLon, #nm, startDate, #tp, userLimit", // the attributes you want in the scan result
        FilterExpression: "locationLon between :lower_lon and :higher_lon and locationLat between :lower_lat and :higher_lat",
        ExpressionAttributeNames: {
            "#nm": "name",
            "#tp": "type",
        },
        ExpressionAttributeValues: {
            ":lower_lon": {"N": event.low_lon},
            ":higher_lon": {"N": event.high_lon},
            ":lower_lat": {"N": event.low_lat},
            ":higher_lat": {"N": event.high_lat}
        }
    };
    db.scan(params, function(err, data) {
        if (err) {
            console.log(err); // an error occurred
        } else {
            data.Items.forEach(function(record) {
                console.log(record.name.S + "");
            });
            context.succeed(data.Items);
        }
    });
};
I think you already know the reason behind this: the distinction that DynamoDB makes between ScannedCount and Count. As per this,
ScannedCount — the number of items that were queried or scanned, before any filter expression was applied to the results.
Count — the number of items that were returned in the response.
The fix for that is documented right above this:
For either a Query or Scan operation, DynamoDB might return a LastEvaluatedKey value if the operation did not return all matching items in the table. To get the full count of items that match, take the LastEvaluatedKey value from the previous request and use it as the ExclusiveStartKey value in the next request. Repeat this until DynamoDB no longer returns a LastEvaluatedKey value.
So, the answer to your question is: use the LastEvaluatedKey from DynamoDB response and Scan again.
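A sketch of that loop which stops once 10 matching items have been collected (the helper name is mine; db is the DynamoDB client and params the scan params from the question):

```javascript
// Scan repeatedly, carrying LastEvaluatedKey forward as ExclusiveStartKey,
// until n matching items are collected or the table is exhausted.
function scanFirstN(db, params, n, callback, acc = []) {
  db.scan(params, (err, data) => {
    if (err) return callback(err);
    acc = acc.concat(data.Items);
    if (acc.length >= n || !data.LastEvaluatedKey) {
      // Enough matches, or nothing left to scan: trim to n and finish.
      return callback(null, acc.slice(0, n));
    }
    const next = Object.assign({}, params, {
      ExclusiveStartKey: data.LastEvaluatedKey
    });
    scanFirstN(db, next, n, callback, acc);
  });
}
```

Note that because of the filter, a single scan call may contribute zero matches; the loop above simply keeps going until it has enough.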