I want to update my Object by adding one more key-value pair.
Object options = {
"first_name": "Nitish",
"last_name" : "Singh"
}
After initializing the Object I want to add one more key and value. Is there a way to do this?
After adding one more key-value pair, my object will look like this:
options = {
"first_name" : "Nitish",
"last_name" : "Singh"
"middle_name": "Kumar"
}
You can assign to a Map using the indexing operator
options['middle_name'] = 'Kumar';
{} is a Map literal that creates a Map instance.
The result lets you use all the methods of Map, such as remove.
I have multiple fields on a single sObject that all use the same Global Value Set. I want to be able to perform an action based on the value of these fields. If a new field using the same GVS were added, I would not want to have to change my code. How can I tell which GVS a field uses?
You have to pull the metadata on the field definition via the Tooling REST API. Resource URI:
/services/data/<api version>/tooling/query?q=SELECT+Id,Label,DurableId,Metadata+FROM+FieldDefinition+WHERE+DurableId='...'
yields this response with <api version> = 46.0 (Ids are redacted and many fields are removed for clarity):
{
"size" : 1,
"totalSize" : 1,
"done" : true,
"queryLocator" : null,
"entityTypeName" : "FieldDefinition",
"records" : [ {
"attributes" : {
"type" : "FieldDefinition",
"url" : "/services/data/v46.0/tooling/sobjects/FieldDefinition/01I..."
},
"Id" : "000...",
"Label" : "State",
"DurableId" : "01I...00N...",
"QualifiedApiName" : "State__c",
"Metadata" : {
...
"valueSet" : {
"controllingField" : null,
"restricted" : true,
"valueSetDefinition" : null,
"valueSetName" : "usa_states",
"valueSettings" : null
},
....
}
} ]
}
If the valueSet key is present and its child key valueSetName has a value other than null, the field is a picklist backed by a Global Value Set. The value of valueSetName is the name of the Global Value Set.
To retrieve the Metadata field from the FieldDefinition entity, your Tooling API query must return exactly 1 record. If the query would return more than one record, you'll see a misleading MALFORMED_QUERY error message.
To return exactly 1 record, querying on DurableId is one way to go. For standard objects, the value of DurableId is easy to construct: it's a concatenation of the form <Object Name>.<Field Name>. For example, on the standard Account object, the Id field has a durable id of Account.Id and the corresponding REST resource would be
/services/data/<api version>/tooling/query?q=SELECT+Id,Label,DurableId,Metadata+FROM+FieldDefinition+WHERE+DurableId='Account.Id'
For custom objects, the value of DurableId has the format <object definition id>.<field definition id> (the period between the two ids is intentional). Note: these are not Ids of records (i.e. instances) of objects and/or fields; they are Ids of the object and field definitions. If we have a custom object Truck__c with an id of 01I... and a custom field State__c with an id of 00N..., the value of DurableId will be 01Ixxxxx.00Nyyyy.
As an alternative to figuring out the DurableId, if you need to pull the metadata of a custom field on a standard or custom object, it's easier to query the CustomField entity via the Tooling API:
/services/data/<api version>/tooling/query?q=SELECT+Id,Metadata,DeveloperName+FROM+CustomField+WHERE+DeveloperName='State'
The value of DeveloperName is the same as the Field Name shown on the custom field's definition page in the admin UI.
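To tie this together, here is a minimal sketch of the check from Node.js. It assumes Node 18+ (global fetch) and an instanceUrl and access token obtained elsewhere (e.g. via OAuth); getGlobalValueSetName is a hypothetical helper name.

// Sketch: query FieldDefinition via the Tooling API and read Metadata.valueSet.
// instanceUrl and accessToken are placeholders; obtain them however you authenticate.
const instanceUrl = 'https://yourInstance.my.salesforce.com';
const accessToken = '<session or OAuth access token>';

async function getGlobalValueSetName(durableId) {
  const soql = "SELECT Id,Label,DurableId,Metadata FROM FieldDefinition " +
               `WHERE DurableId='${durableId}'`;
  const url = `${instanceUrl}/services/data/v46.0/tooling/query?q=${encodeURIComponent(soql)}`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
  const body = await res.json();

  // Exactly one record is expected; valueSet is only present for picklist fields.
  const valueSet = body.records?.[0]?.Metadata?.valueSet;
  return valueSet ? valueSet.valueSetName : null; // null => not backed by a GVS
}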
I have a product collection in which I have a document like this:
"_id" : ObjectId("5acb1dad698eaa7a254c9017"),
"txtProductCode" : "1233A",
"txtModelCode" : "00M",
"txtPartNo" : "00P",
"txtSerialNo" : "00S",
"txtProductName" : "Watch",
"traderId" : ObjectId("5ac5fb29b0f9b3444e6c1ef2")
I want to search for a product based on its name and traderId, for which I used:
db.getCollection('product').find( {$and:[{'txtProductName':"Watch"},{"traderId" : ObjectId("5ac5fb29b0f9b3444e6c1ef2")}]})
It's working fine, but now if the user has entered a model number then the search should also use the model number; if the user has not entered a model number then it should search without it.
So my question is: do I have to use cases like this
if(req.body.modelNo)
db.getCollection('product').find( {$and:[{'txtProductName':"Watch"},{"traderId" : ObjectId("5ac5fb29b0f9b3444e6c1ef2")},{'txtModelCode':"00M"}]})
else
db.getCollection('product').find( {$and:[{'txtProductName':"Watch"},{"traderId" : ObjectId("5ac5fb29b0f9b3444e6c1ef2")}]})
Or is there a way to do this without making cases? I have to do this for multiple conditions, so I am trying not to use cases.
Create the query object first, then add the extra key with a conditional check. There is no need to explicitly use the $and operator when specifying a comma-separated list of expressions, as it's implicitly provided:
let query = {
'txtProductName': 'Watch',
'traderId': ObjectId('5ac5fb29b0f9b3444e6c1ef2')
};
if (req.body.modelNo) query['txtModelCode'] = req.body.modelNo;
db.getCollection('product').find(query);
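Since you mention having multiple optional conditions, the same idea scales without if/else chains. A rough sketch, where the req.body key names (modelNo, partNo, serialNo) and their mapping to document fields are assumptions for illustration:

// Each optional request field maps to the document field it filters on.
const optionalFilters = {
  modelNo: 'txtModelCode',
  partNo: 'txtPartNo',
  serialNo: 'txtSerialNo'
};

let query = {
  'txtProductName': 'Watch',
  'traderId': ObjectId('5ac5fb29b0f9b3444e6c1ef2')
};

// Add only the filters the user actually supplied.
for (const [bodyKey, docKey] of Object.entries(optionalFilters)) {
  if (req.body[bodyKey]) query[docKey] = req.body[bodyKey];
}

db.getCollection('product').find(query);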
If using the $and operator, you can push the additional condition into an array and then pass the list to $and:
let andOperator = [
{ 'txtProductName': 'Watch' },
{ 'traderId': ObjectId('5ac5fb29b0f9b3444e6c1ef2') }
];
if (req.body.modelNo) andOperator.push({ 'txtModelCode': req.body.modelNo });
// if (req.body.modelNo) andOperator = [...andOperator, { 'txtModelCode': req.body.modelNo }];
db.getCollection('product').find({ '$and': andOperator });
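The explicit $and form is mainly useful when the same field has to appear in more than one condition (duplicate keys in a plain object literal would overwrite each other); when every condition targets a different field, the first approach is simpler.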
Well, I would have done this in the following way.
First, you should send JSON in a specific form to the backend. For example:
[{'txtModelCode':"00M"},{'txtPartNo':"AC"},{'Yts':"xyz"}]
OR
[{'txtModelCode':"00M"},{'txtPartNo':"AC"}]
OR
[{'txtModelCode':"00M"}]
This is the payload you should expect in req.body. Finally, you can use it in your find() criteria, something like:
db.getCollection('product').find( {$and:[{'txtProductName':"Watch"},
{"traderId" : ObjectId("5ac5fb29b0f9b3444e6c1ef2")}, ...req.body]})
... is the spread syntax. It allows an iterable such as an array or a string to be expanded in place. Read more about it here.
This makes it totally dynamic: any new field in the collection can be used directly in the find criteria, and you never have to add an extra line of code.
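As a rough sketch of the receiving side (the route path, the Express-style handler, and the use of the native driver's db.collection are assumptions; the filter array comes straight from the client as described above):

// Express route that spreads the client-supplied filter array into $and.
const express = require('express');
const { ObjectId } = require('mongodb');

const app = express();
app.use(express.json());

// `db` is assumed to be an already-connected Db instance (e.g. from MongoClient).
app.post('/products/search', async (req, res) => {
  const filters = Array.isArray(req.body) ? req.body : []; // e.g. [{ txtModelCode: "00M" }]
  const products = await db.collection('product').find({
    $and: [
      { txtProductName: 'Watch' },
      { traderId: new ObjectId('5ac5fb29b0f9b3444e6c1ef2') },
      ...filters
    ]
  }).toArray();
  res.json(products);
});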
I noticed that DynamoDB can add and remove items from an array, but how do you search for a specific item inside an object if you want to update that one specifically?
For example:
In MongoDB you can search for someitem.$.subitem and update that specific item.
Is there a way on how to do this with DynamoDB?
Item: {
someitem: [
{
subitem: "id",
somevalue: "something"
}
]
}
I would say this is basic functionality, but it seems not easy to find (or even unsupported).
AWS does not permit modifying it in a single update request; more info can be found in the following answer:
updating-a-json-array-in-aws-dynamodb.
The solution they propose is to change the schema from an array to a map ({}), or to implement a custom function that iterates through the array, finds the element with your id, programmatically updates the JSON, and then writes the whole object back. Once you know the index of the element, the update parameters look like this:
{
    TableName : 'tablename',
    Key : { id: id },
    ReturnValues : 'ALL_NEW',
    UpdateExpression : `set someitem[${index}].somevalue = :reply_content`,
    ExpressionAttributeValues : { ':reply_content' : updateddata }
}
This edits the array element directly via its array index.
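A rough sketch of that read-then-update flow with the AWS SDK for JavaScript v2 DocumentClient (table name, key shape and attribute names follow the example above; updateSubitem is a hypothetical helper). Keep in mind the index can shift if another writer changes the list between the read and the write, so a condition expression may be needed in practice.

// Sketch: find the list element whose subitem matches, then update it by index.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function updateSubitem(id, subitemId, newValue) {
  // 1. Read the item to locate the element inside the list.
  const { Item } = await docClient.get({ TableName: 'tablename', Key: { id } }).promise();
  const index = (Item.someitem || []).findIndex(e => e.subitem === subitemId);
  if (index === -1) throw new Error('subitem not found');

  // 2. Update only that list element, addressed by its index.
  return docClient.update({
    TableName: 'tablename',
    Key: { id },
    UpdateExpression: `set someitem[${index}].somevalue = :v`,
    ExpressionAttributeValues: { ':v': newValue },
    ReturnValues: 'ALL_NEW'
  }).promise();
}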
I would like to know how we can access nested array elements in MongoDB.
For example, if we have something like :
{
array1 : [
{
array11 : {
name11 : "xyz"
}
},{
array12 : {
name12: "abc",
nums : [1,2,3,4]
}
}
]
}
Now how can I access and update the "name12" field?
And how can I add elements to the "nums" field?
A) To find the document: I am assuming you want to find name12 = "abc".
db.mydata.find({"array1.array12.name12":"abc"}).pretty()
B) To update & Add elements to the nums array use the positional operator :
https://docs.mongodb.com/manual/reference/operator/update/positional/
db.mydata.update({"array1.array12.name12":"abc"},
    {$set:{"array1.$.array12.name12":"abc"},
     $push:{"array1.$.array12.nums":5}
    })
On a side note, you should consider redefining your schema so that the arrays have similar structures. It will help you with updates in the long run.
You can access it like any traditional JS array; in this case you're accessing objects within arrays, so take care with the syntax.
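For example, once a document is read into a variable (the collection name mydata is borrowed from the answer above; index 1 is where array12 sits in the example document), plain property and index access works:

// Plain JS access into the nested structure of the example document.
const doc = db.mydata.findOne({ "array1.array12.name12": "abc" });

const name = doc.array1[1].array12.name12;   // "abc"
const nums = doc.array1[1].array12.nums;     // [1, 2, 3, 4]

// Mutating these values only changes the in-memory copy; use an update
// (e.g. the positional-operator query above) to persist any change.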
I'm trying to output a list of properties from a Mongoose object, but I get a lot of Javascript helper functions too. I'm wondering if there's a clean way to just output my Mongoose schema properties.
My Jade display code looks like:
h4 Legacy data
ul
- each val, key in d.old
li= key + ": " + val
And my Mongoose schema definition is
Entry = new Schema({
old : {
submitter : String,
table : String,
wordid : Number
}
});
But when the page is rendered, there are a bunch of other Javascript properties and functions that get outputted at the same time. e.g.
_scope: [object Object]
toObject: function () { return this.get(path); }
wordid: 2035
...
Is there an easy way to iterate just through the properties from my schema?
I could use a specified list but I was wondering if there was a nicer way.
Actually, how would I write the specified-list way? In Ruby I know I could do [ 'wordid', 'submitter' ].each, but is there an equivalent in Jade?
You're encountering the object's prototype properties. You can filter them out with .hasOwnProperty
- each val, key in d.old
- if(d.old.hasOwnProperty(key))
li= key + ": " + val
Remember that you can also use the toJSON method on the document (see the Mongoose docs for Document#toJSON) to get a plain JSON object that can be used in your templates, without worrying about the Mongoose document's internals and methods. In fact, the toObject method you mentioned is similar to toJSON; you might want to check it out.
For example:
var doc = new EntryModel({old: {submitter: "s", table: "tableS", wordid: "666"}})
console.log(doc.toJSON())
// outputs:
{
"_id": "51fea037434b242816000002",
"old": {
"submitter": "s",
"table": "tableS",
"wordid": 666
}
}
// A plain object without any other properties or methods
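For example, a rough sketch of a route that hands the template plain data only (the route path, template name, and callback-style Mongoose API are assumptions based on the era of this question):

// Render with a plain object so the Jade loop never sees Mongoose internals.
app.get('/entry/:id', function (req, res, next) {
  EntryModel.findById(req.params.id, function (err, entry) {
    if (err) return next(err);
    // In the template, `d.old` now iterates schema data only - no helper functions.
    res.render('entry', { d: entry.toJSON() });
  });
});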