I am trying to update an existing record in my DynamoDB table.
I have an item in my table that looks like this:
let params = {
  TableName: process.env.dynamoDbTable,
  Item: {
    productId: "id",
    att1: val1,
    att2: val2
  }
}
I want to perform an update. I am using the AWS DynamoDB SDK's DocumentClient update method and passing it params like below:
let aws = require('aws-sdk');
let moment = require('moment');
let dbb = new aws.DynamoDB.DocumentClient();
let params = {
  TableName: process.env.tableName,
  Key: { productId },
  ExpressionAttributeNames: { "#updatedAt": "updatedAt" },
  ExpressionAttributeValues: { ":u": moment().unix(), ":val1": a, ":val2": b },
  UpdateExpression: "SET att1 = :val1, att2 = :val2, #updatedAt = :u"
}
// a, b are passed as arguments to the function and are optional
dbb.update(params).promise()
When an argument is missing, DynamoDB raises an ExpressionAttributeValues missing exception, which is expected. Is there a way I can update my item using only the attributes that are actually provided?
Unfortunately DynamoDB does not make this easy, but you can use some fancy JavaScript to create the params object dynamically:
// lodash is used further down for zipObject()
const _ = require('lodash');
// for some object `attrs` we find which keys we need to update
const keys = Object.keys(attrs);
const values = Object.values(attrs);
// get a list of key names and value names to match the dynamodb syntax
const attributeKeyNames = keys.map((k) => '#key_' + k);
const attributeValueNames = keys.map((k) => ':val_' + k);
// create individual expressions for each attribute that needs to be updated
const expressions = attributeValueNames.map((attr, i) => {
return `${attributeKeyNames[i]} = ${attr}`;
});
// add the SET keyword to the beginning of the string
// and join all the expressions with a comma
const UpdateExpression = 'SET ' + expressions.join(', ');
// I use `zipObject()` from lodash https://lodash.com/docs/4.17.15#zipObject
// it makes an object map from two arrays where the first is the keys and
// the second is the values
const ExpressionAttributeValues = _.zipObject(attributeValueNames, values);
const ExpressionAttributeNames = _.zipObject(attributeKeyNames, keys);
// now you have all the params
const params = {
TableName,
Key: {
uuid,
},
UpdateExpression,
ExpressionAttributeValues,
ExpressionAttributeNames,
};
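For completeness, a minimal sketch of actually sending those params (the documentClient name is an assumption; use whatever DocumentClient instance you already have):
// documentClient is assumed to be a new AWS.DynamoDB.DocumentClient()
const result = await documentClient.update({
  ...params,
  ReturnValues: 'ALL_NEW' // optional: returns the item as it looks after the update
}).promise();
console.log(result.Attributes);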
Lucas D's answer works great. A couple of things to keep in mind though:
If you're sending an object with a unique key into your update function, you must remove that key before mapping your expression arrays, or it will be included in the update expression and you will get an error:
const newObject = {...originalObject}
delete newObject.unique_key
If you can't or don't want to use Lodash, you can use Object.assign to map your keys/values arrays:
const ExpressionAttributeValues = Object.assign(...attributeValueNames.map((k, i) => ({[k]: values[i]})));
const ExpressionAttributeNames = Object.assign(...attributeKeyNames.map((k, i) => ({[k]: keys[i]})));
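Another Lodash-free option (a small sketch, assuming a runtime with ES2019's Object.fromEntries, i.e. Node 12+) is to build both maps from key/value pairs directly:
// Object.fromEntries turns an array of [key, value] pairs into an object
const ExpressionAttributeValues = Object.fromEntries(
  attributeValueNames.map((k, i) => [k, values[i]])
);
const ExpressionAttributeNames = Object.fromEntries(
  attributeKeyNames.map((k, i) => [k, keys[i]])
);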
I'm trying to perform an update call on a DynamoDB.DocumentClient instance using the AWS SDK, with the payload in the code snippet below:
const AWS = require('aws-sdk')
const DynamoDB = new AWS.DynamoDB.DocumentClient()
...
const TableName = 'MyTable'
const Key = { PK: 'MyPK', SK: 'MySK' }
const operation = 'DELETE'
const myId = 'abcde'
const currentRecord = await DynamoDB.get({ TableName, Key }).promise()
DynamoDB.update({
TableName,
Key,
UpdateExpression: `
${operation} myIds :valuesToModify,
version :incrementVersionBy
`,
ConditionExpression: `version = :version`,
ExpressionAttributeValues: {
":version": currentRecord.version,
":incrementVersionBy": 1,
":valuesToModify": DynamoDB.createSet([myId])
}
})...
I get this error as result:
ERROR Invoke Error
{
"errorType":"Error",
"errorMessage":"ValidationException: Invalid UpdateExpression: Incorrect operand type for operator or function;
operator: DELETE, operand type: NUMBER, typeSet: ALLOWED_FOR_DELETE_OPERAND",
"stack":[...]
}
Interestingly, if operation is changed to ADD it works well.
Any clues that could help me understand why ADD works and DELETE does not, how to fix it, or alternative approaches compatible with this update operation are highly appreciated!
The only workaround possible here is not to use a DELETE operation; instead, you have to query the item, find the index of the element you wish to delete, and remove it with a REMOVE operation.
In this case, arrayField contains an array of users, and I want to delete by the user's phoneNumber.
const dataStore = await dynamodb.get(queryParams).promise();
let i = 0; // save the index
for (i = 0; i < dataStore.Item.arrayField.length; i++) {
  if (dataStore.Item.arrayField[i].phone === phoneNumber) {
    break;
  }
}
if (i < dataStore.Item.arrayField.length) {
  const updateStoreParams = {
    TableName: tableName,
    Key: storeTableKey,
    UpdateExpression: `REMOVE arrayField[${i}]`,
  };
  await dynamodb.update(updateStoreParams).promise().catch((err) => {
    console.log(err);
    throw err;
  });
}
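One caveat with this workaround: between the get and the update the list can change, so the index you computed may point at a different element by then. A small sketch of guarding against that with a ConditionExpression (same arrayField/phoneNumber names as above):
// Only remove the element if it still holds the phone number we matched;
// otherwise the update fails with a ConditionalCheckFailedException.
const guardedParams = {
  TableName: tableName,
  Key: storeTableKey,
  UpdateExpression: `REMOVE arrayField[${i}]`,
  ConditionExpression: `arrayField[${i}].phone = :phone`,
  ExpressionAttributeValues: { ':phone': phoneNumber },
};
await dynamodb.update(guardedParams).promise();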
It ended up being a semantic error I didn't pay attention to.
When ${operation} was ADD, the version part of the UpdateExpression would work because ADD supports numeric increments.
When ${operation} was DELETE, the version part didn't work because, as the error states, it was an incorrect operand type for the operator or function: DELETE only works for removing elements from a set, as per the docs.
The error was a bit misleading at first, but when I tried to implement this with another SDK I ended up with the same error. I then focused on the UpdateExpression and found that I had to refactor it to something like this in order for it to work:
// Notice below that I inject ADD if the operation is DELETE, and a comma otherwise
DynamoDB.update({
  TableName,
  Key,
  UpdateExpression: `
    ${operation} socketIds :valuesToModify
    ${operation == 'DELETE' ? 'ADD' : ','} version :incrementVersionBy
  `,
  ConditionExpression: `version = :version`,
  ExpressionAttributeValues: {
    ':version': channelRecord.version,
    ':incrementVersionBy': 1,
    ':valuesToModify': DynamoDB.createSet([socketId])
  }
})
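Since the ConditionExpression checks version, the update is rejected if another writer bumps the version first. A small sketch of handling that rejection (optimistic-locking style, with the same DynamoDB client as above):
try {
  await DynamoDB.update({ /* params as above */ }).promise();
} catch (err) {
  if (err.code === 'ConditionalCheckFailedException') {
    // someone else updated the record first: re-read it and retry the update
  } else {
    throw err;
  }
}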
Hopefully it will become useful to others in the future!
I want to retrieve items from the table where an attribute is equal to a value I specify, using DynamoDB.
The SQL equivalent is:
SELECT * FROM tokens WHERE type = 'app_access_token'
Code:
const db = new AWS.DynamoDB.DocumentClient(credentials);
const params = {
TableName: 'TOKEN',
Key: {
type: 'app_access_token'
}
};
const response = db.get(params).promise();
But I believe this will only let me get via primary key, is that right?
[Screenshot: NoSQL Workbench table structure]
DynamoDB's equivalent of SELECT * is the scan method. Both read all of the table's records (and are hence slow and expensive).
You can learn more about the scan method in Amazon Developer Guide or in this useful thread.
If you can, I would use a GSI with the type attribute as its key, but if you still want to do it with the scan method, here's how you do it:
const params = {
  TableName: 'TOKEN',
  // 'type' is a reserved word in DynamoDB, so it has to be aliased
  // through ExpressionAttributeNames
  FilterExpression: '#type = :type',
  ExpressionAttributeNames: { '#type': 'type' },
  ExpressionAttributeValues: { ':type': 'app_access_token' }
};
const documentClient = new AWS.DynamoDB.DocumentClient();
const response = documentClient.scan(params).promise();
AWS JavaScript SDK - DocumentClient Scan()
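If you do add a GSI keyed on the type attribute, the lookup becomes a query instead of a scan. A minimal sketch, assuming a GSI named 'type-index' (the index name is hypothetical):
const queryParams = {
  TableName: 'TOKEN',
  IndexName: 'type-index', // hypothetical GSI with `type` as its partition key
  KeyConditionExpression: '#type = :type',
  ExpressionAttributeNames: { '#type': 'type' },
  ExpressionAttributeValues: { ':type': 'app_access_token' }
};
const response = documentClient.query(queryParams).promise();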
Running a Node.js serverless backend through AWS.
Main objective: to filter and list all LOCAL jobs (table items) that include the available services and zip codes provided to the filter.
I'm passing in multiple zip codes and multiple available services.
data.radius would be an array of zip codes, equal to something like this: [ '93901', '93902', '93905', '93906', '93907', '93912', '93933', '93942', '93944', '93950', '95377', '95378', '95385', '95387', '95391' ]
data.availableServices would also be an array, equal to something like this: ['Snow removal', 'Ice Removal', 'Salting', 'Same Day Response']
I am trying to make an API call that returns only items whose zipCode matches one of the zip codes provided by data.radius, and whose packageSelected matches one of the values in data.availableServices.
API CALL
import * as dynamoDbLib from "./libs/dynamodb-lib";
import { success, failure } from "./libs/response-lib";
export async function main(event, context) {
const data = JSON.parse(event.body);
const params = {
TableName: "jobs",
FilterExpression: "zipCode = :radius, packageSelected = :availableServices",
ExpressionAttributeValues: {
":radius": data.radius,
":availableServices": data.availableServices
}
};
try {
const result = await dynamoDbLib.call("query", params);
// Return the matching list of items in response body
return success(result.Items);
} catch (e) {
return failure({ status: false });
}
}
Do I need to map the array of zip codes and available services first for this to work?
Should I be using comparison operators?
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LegacyConditionalParameters.QueryFilter.html
Is a sort key value or partition key required to query and filter? (The table has a sort key and partition key, but I would like to avoid using them in this call.)
I'm not 100% sure how to go about this, so if anyone could point me in the right direction that would be wonderful and greatly appreciated!
I'm not sure what your dynamodb-lib refers to but here's an example of how you can scan for attribute1 in a given set of values and attribute2 in a different set of values. This uses the standard AWS JavaScript SDK, and specifically the high-level document client.
Note that you cannot use an equality (=) test here; you have to use an inclusion (IN) test. And you cannot use query; you must use scan.
const AWS = require('aws-sdk');
let dc = new AWS.DynamoDB.DocumentClient({'region': 'us-east-1'});
const data = {
radius: [ '93901', '93902', '93905', '93906', '93907', '93912', '93933', '93942', '93944', '93950', '95377', '95378', '95385', '95387', '95391' ],
availableServices: ['Snow removal', 'Ice Removal', 'Salting', 'Same Day Response'],
};
// These hold ExpressionAttributeValues
const zipcodes = {};
const services = {};
data.radius.forEach((zipcode, i) => {
zipcodes[`:zipcode${i}`] = zipcode;
})
data.availableServices.forEach((service, i) => {
services[`:services${i}`] = service;
})
// These hold FilterExpression attribute aliases
const zipcodex = Object.keys(zipcodes).toString();
const servicex = Object.keys(services).toString();
const params = {
TableName: "jobs",
FilterExpression: `zipCode IN (${zipcodex}) AND packageSelected IN (${servicex})`,
ExpressionAttributeValues : {...zipcodes, ...services},
};
dc.scan(params, (err, data) => {
if (err) {
console.log('Error', err);
} else {
for (const item of data.Items) {
console.log('item:', item);
}
}
});
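Also worth noting: a single scan call returns at most 1 MB of data, so on larger tables you may need to page through results with LastEvaluatedKey. A minimal promise-based sketch using the same dc client and params:
// Keep scanning until DynamoDB stops returning a LastEvaluatedKey
async function scanAll(params) {
  const items = [];
  let ExclusiveStartKey;
  do {
    const page = await dc.scan({ ...params, ExclusiveStartKey }).promise();
    items.push(...page.Items);
    ExclusiveStartKey = page.LastEvaluatedKey;
  } while (ExclusiveStartKey);
  return items;
}
You would call it as `const items = await scanAll(params);`.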
I'm using DynamoDB to manage PATCH requests where one or more properties may be provided. I'd like those properties to be updated if they exist in the request, and otherwise ignored in the update. I'm calling DocumentClient.update(params), where params is:
TableName: '...',
Key: {...},
UpdateExpression: `set
Cost = :Cost,
Sales = :Sales,
...
ExpressionAttributeValues: {
':Cost': get(requestBody, 'form.cost', undefined),
':Sales': get(requestBody, 'form.sales', undefined),
...
}
Or is achieving this only possible by manipulating the expression strings?
I feel like I'm using DynamoDB wrong for this multi-field PATCH, especially since the solution is overly complex.
Leaving this here in case anyone else finds it helpful, or better, has a tidier solution:
const _ = require('lodash');
let fieldsToUpdate = [
// these are just string constants from another file
[dynamoDbFields.cost, apiFields.cost],
[dynamoDbFields.annualSales, apiFields.annualSales],
... ]
// get the new DynamoDB value from the request body (it may not exist)
.map(([dynamoDbField, apiField]) => [dynamoDbField, _.get(requestBody, apiField)])
// filter any keys that are undefined on the request body
.filter(([dynamoDbField, value]) => value !== undefined)
// create a mapping of the field identifier (positional index in this case) to the DynamoDB value, e.g. {':0': '123'}
let expressionAttributeValues = fieldsToUpdate.reduce((acc, [dynamoDbField, value], index) =>
_.assignIn(acc, {[`:${index}`]: value}), {})
// and create the reciprocal mapping of the DynamoDB field name to the identifier, e.g. {ID: ':0'}
let updateExpression = fieldsToUpdate.reduce((acc, [dynamoDbField], index) =>
_.assignIn(acc, {[dynamoDbField]: `:${index}`}), {})
const params = {
TableName: TABLE_NAME,
Key: {[dynamoDbFields.id]: _.get(requestBody, apiFields.id)},
UpdateExpression: `set ${Object.entries(updateExpression).map((v) => v.join('=')).join(',')}`,
ExpressionAttributeValues: expressionAttributeValues,
ConditionExpression: `attribute_exists(${dynamoDbFields.id})`,
ReturnValues: 'ALL_NEW'
}
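For completeness, a minimal sketch of sending these params (assuming a DocumentClient instance named documentClient):
const documentClient = new AWS.DynamoDB.DocumentClient();
const result = await documentClient.update(params).promise();
// because of ReturnValues: 'ALL_NEW', result.Attributes holds the updated item
console.log(result.Attributes);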
I have a user table with a field username. I need to write something equivalent to this in DynamoDB: SELECT * FROM user WHERE username IN ('a','b','c');
To add more from a code perspective, I have the usernames in an array, say var arr = ['a','b','c'];
So far I have tried this, which is giving me zero results:
this.dynamo.client.scanAsync({
TableName: this.dynamo.table('users'),
FilterExpression: 'username IN (:list)',
ExpressionAttributeValues: {
':list': arr.toString()
}
}).then((response) => {
console.log(response);
return {
userFriends: response.Items.map((user) => user.friends)
};
});
When I pass one element in the array it gives me a result, searching the user table for that single element, but it's not working with more than one element in the array.
The individual usernames should be given as separate, comma-separated String values. A JavaScript array is equivalent to the List type in DynamoDB, and DynamoDB can't compare a String attribute in the database with a List value (i.e. an Array in JavaScript).
var params = {
TableName : "Users",
FilterExpression : "username IN (:user1, :user2)",
ExpressionAttributeValues : {
":user1" : "john",
":user2" : "mike"
}
};
Constructing the object from an array for the FilterExpression:
Please refer to the code below for forming the object dynamically based on the array values.
var titleValues = ["The Big New Movie 2012", "The Big New Movie"];
var titleObject = {};
var index = 0;
titleValues.forEach(function(value) {
index++;
var titleKey = ":titlevalue"+index;
titleObject[titleKey.toString()] = value;
});
var params = {
TableName : "Movies",
FilterExpression : "title IN ("+Object.keys(titleObject).toString()+ ")",
ExpressionAttributeValues : titleObject
};
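For completeness, a minimal sketch of running the scan with the dynamically built params above (assuming a DocumentClient instance against the same Movies example):
const AWS = require('aws-sdk');
const documentClient = new AWS.DynamoDB.DocumentClient();
documentClient.scan(params, function(err, data) {
  if (err) {
    console.error(err);
  } else {
    // data.Items holds the movies whose title matched one of the IN values
    console.log(data.Items);
  }
});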
Note:
I don't think an IN clause with thousands of usernames is a good idea in terms of performance.