Related
I have an array that looks like ['N300W150727', '123test123', '123test1234'] and I want to push its elements into an array field in MongoDB.
I used $push, but it adds the array inside the array:
async updateSn(updateSn: UpdateSN) {
  const { id, bindedSn } = updateSn;
  return await this.userModel.updateOne(
    { id: id },
    {
      $push: {
        bindedSn: bindedSn,
      },
    },
  );
}
Result
bindedSn: Array
  0: "123test123"
  1: "123test1234"
  2: Array
My questions are:
1 - How do I spread an array inside an array in MongoDB? I used the spread operator and nothing happened:
async updateSn(updateSn: UpdateSN) {
  const { id, bindedSn } = updateSn;
  return await this.userModel.updateOne(
    { id: id },
    {
      $push: {
        bindedSn: [...bindedSn],
      },
    },
  );
}
2 - How can I send the items of the array one by one to the service?
I guess what you want to do is to combine $push and $each
userModel.updateOne(
  { id: id },
  { $push: { bindedSn: { $each: bindedSn } } }
)
More from the docs here.
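Applied to the service from the question, a minimal sketch (same method as above, only the update document changes):

async updateSn(updateSn: UpdateSN) {
  const { id, bindedSn } = updateSn;
  // $each appends every element of bindedSn individually,
  // instead of pushing the whole array as a single nested element
  return await this.userModel.updateOne(
    { id: id },
    { $push: { bindedSn: { $each: bindedSn } } },
  );
}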
In MongoDB, is it possible to update the value of a field using the value from another field? The equivalent SQL would be something like:
UPDATE Person SET Name = FirstName + ' ' + LastName
And the MongoDB pseudo-code would be:
db.person.update( {}, { $set : { name : firstName + ' ' + lastName } } );
The best way to do this is in version 4.2+, which allows using the aggregation pipeline in the update document and in the updateOne, updateMany, or update (deprecated in most if not all language drivers) collection methods.
MongoDB 4.2+
Version 4.2 also introduced the $set pipeline stage operator, which is an alias for $addFields. I will use $set here as it maps with what we are trying to achieve.
db.collection.<update method>(
  {},
  [
    { "$set": { "name": { "$concat": [ "$firstName", " ", "$lastName" ] } } }
  ]
)
Note that square brackets in the second argument to the method specify an aggregation pipeline instead of a plain update document because using a simple document will not work correctly.
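As a quick illustration of why the brackets matter (a hedged shell sketch): in a plain update document, "$firstName" is stored as a literal string rather than resolved as a field reference.

// plain update document: name becomes the literal string "$firstName"
db.collection.updateMany({}, { "$set": { "name": "$firstName" } })
// aggregation pipeline (note the [ ]): name becomes the value of the firstName field
db.collection.updateMany({}, [ { "$set": { "name": "$firstName" } } ])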
MongoDB 3.4+
In 3.4+, you can use $addFields and the $out aggregation pipeline operators.
db.collection.aggregate(
  [
    { "$addFields": {
      "name": { "$concat": [ "$firstName", " ", "$lastName" ] }
    }},
    { "$out": <output collection name> }
  ]
)
Note that this does not update your collection but instead replaces the existing collection or creates a new one. Also, for update operations that require "typecasting", you will need client-side processing, and depending on the operation, you may need to use the find() method instead of the .aggregate() method.
MongoDB 3.2 and 3.0
The way we do this is by $projecting our documents and using the $concat string aggregation operator to return the concatenated string.
You then iterate the cursor and use the $set update operator to add the new field to your documents using bulk operations for maximum efficiency.
Aggregation query:
var cursor = db.collection.aggregate([
  { "$project": {
    "name": { "$concat": [ "$firstName", " ", "$lastName" ] }
  }}
])
MongoDB 3.2 or newer
You need to use the bulkWrite method.
var requests = [];
cursor.forEach(document => {
  requests.push({
    'updateOne': {
      'filter': { '_id': document._id },
      'update': { '$set': { 'name': document.name } }
    }
  });
  if (requests.length === 500) {
    // Execute per 500 operations and re-init
    db.collection.bulkWrite(requests);
    requests = [];
  }
});
if (requests.length > 0) {
  db.collection.bulkWrite(requests);
}
MongoDB 2.6 and 3.0
From this version, you need to use the now deprecated Bulk API and its associated methods.
var bulk = db.collection.initializeUnorderedBulkOp();
var count = 0;
cursor.snapshot().forEach(function(document) {
  bulk.find({ '_id': document._id }).updateOne({
    '$set': { 'name': document.name }
  });
  count++;
  if (count % 500 === 0) {
    // Execute per 500 operations and re-init
    bulk.execute();
    bulk = db.collection.initializeUnorderedBulkOp();
  }
})
// clean up queues
if (count > 0) {
  bulk.execute();
}
MongoDB 2.4
cursor["result"].forEach(function(document) {
db.collection.update(
{ "_id": document._id },
{ "$set": { "name": document.name } }
);
})
You should iterate through. For your specific case:
db.person.find().snapshot().forEach(
  function (elem) {
    db.person.update(
      { _id: elem._id },
      {
        $set: {
          name: elem.firstName + ' ' + elem.lastName
        }
      }
    );
  }
);
Apparently there is a way to do this efficiently since MongoDB 3.4, see styvane's answer.
Obsolete answer below
You cannot refer to the document itself in an update (yet). You'll need to iterate through the documents and update each document using a function. See this answer for an example, or this one for server-side eval().
For a database with high activity, you may run into issues where your updates affect actively changing records; for this reason, I recommend using snapshot().
db.person.find().snapshot().forEach( function (hombre) {
  hombre.name = hombre.firstName + ' ' + hombre.lastName;
  db.person.save(hombre);
});
http://docs.mongodb.org/manual/reference/method/cursor.snapshot/
Starting Mongo 4.2, db.collection.update() can accept an aggregation pipeline, finally allowing the update/creation of a field based on another field:
// { firstName: "Hello", lastName: "World" }
db.collection.updateMany(
{},
[{ $set: { name: { $concat: [ "$firstName", " ", "$lastName" ] } } }]
)
// { "firstName" : "Hello", "lastName" : "World", "name" : "Hello World" }
The first part {} is the match query, filtering which documents to update (in our case all documents).
The second part [{ $set: { name: { ... } } }] is the update aggregation pipeline (note the square brackets signifying the use of an aggregation pipeline). $set is a new aggregation operator and an alias of $addFields.
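For example, to fill in name only on documents that are still missing it, the match query can be tightened; a small sketch reusing the same fields:

db.collection.updateMany(
  { name: { $exists: false } },  // only documents without a name yet
  [{ $set: { name: { $concat: [ "$firstName", " ", "$lastName" ] } } }]
)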
Regarding this answer, the snapshot function is deprecated in version 3.6, according to this update. So, on version 3.6 and above, it is possible to perform the operation this way:
db.person.find().forEach(
  function (elem) {
    db.person.update(
      { _id: elem._id },
      {
        $set: {
          name: elem.firstName + ' ' + elem.lastName
        }
      }
    );
  }
);
I tried the above solution but I found it unsuitable for large amounts of data. I then discovered the stream feature:
MongoClient.connect("...", function(err, db){
  var c = db.collection('yourCollection');
  var s = c.find({/* your query */}).stream();
  s.on('data', function(doc){
    c.update({_id: doc._id},
      {$set: {name: doc.firstName + ' ' + doc.lastName}},
      function(err, result) { /* result == true? */ });
  });
  s.on('end', function(){
    // stream can end before all your updates do if you have a lot
  })
})
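One way to work around that caveat is to count in-flight updates and finish only once the stream has ended and every update callback has returned; a rough sketch in the same old-driver style (the counting logic is my addition):

MongoClient.connect("...", function(err, db){
  var c = db.collection('yourCollection');
  var s = c.find({/* your query */}).stream();
  var pending = 0, ended = false;
  function maybeDone() {
    // only safe to clean up once the stream is done AND all updates are acknowledged
    if (ended && pending === 0) db.close();
  }
  s.on('data', function(doc){
    pending++;
    c.update({_id: doc._id},
      {$set: {name: doc.firstName + ' ' + doc.lastName}},
      function(err, result){ pending--; maybeDone(); });
  });
  s.on('end', function(){ ended = true; maybeDone(); });
})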
The update() method takes an aggregation pipeline as a parameter, like:
db.collection_name.update(
  {
    // Query
  },
  [
    // Aggregation pipeline
    { "$set": { "id": "$_id" } }
  ],
  {
    // Options
    "multi": true // false when a single doc has to be updated
  }
)
The field can be set or unset with existing values using the aggregation pipeline.
Note: use $ with the field name to specify the field that has to be read.
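For instance, unsetting works the same way inside the pipeline; a small sketch (legacy_id is a hypothetical field, and note that the pipeline form of $unset takes the bare field name):

db.collection_name.update(
  { /* Query */ },
  [
    { "$set": { "id": "$_id" } },  // copy _id into id
    { "$unset": "legacy_id" }      // drop an old field in the same pass
  ],
  { "multi": true }
)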
Here's what we came up with for copying one field to another for ~150_000 records. It took about 6 minutes, but is still significantly less resource intensive than it would have been to instantiate and iterate over the same number of ruby objects.
js_query = %({
  $or : [
    {
      'settings.mobile_notifications' : { $exists : false },
      'settings.mobile_admin_notifications' : { $exists : false }
    }
  ]
})

js_for_each = %(function(user) {
  if (!user.settings.hasOwnProperty('mobile_notifications')) {
    user.settings.mobile_notifications = user.settings.email_notifications;
  }
  if (!user.settings.hasOwnProperty('mobile_admin_notifications')) {
    user.settings.mobile_admin_notifications = user.settings.email_admin_notifications;
  }
  db.users.save(user);
})

js = "db.users.find(#{js_query}).forEach(#{js_for_each});"

Mongoid::Sessions.default.command('$eval' => js)
With MongoDB version 4.2+, updates are more flexible, as it allows the use of the aggregation pipeline in update, updateOne, and updateMany. You can now transform your documents using the aggregation operators and then update, without the need to explicitly state the $set command (instead we use $replaceRoot: {newRoot: "$$ROOT"}).
Here we use the aggregation query to extract the timestamp from MongoDB's ObjectID "_id" field and update the documents. (I am not an expert in SQL, but I think SQL does not provide an auto-generated ObjectID that has a timestamp in it; you would have to create that date yourself.)
var collection = "person"
agg_query = [
{
"$addFields" : {
"_last_updated" : {
"$toDate" : "$_id"
}
}
},
{
$replaceRoot: {
newRoot: "$$ROOT"
}
}
]
db.getCollection(collection).updateMany({}, agg_query, {upsert: true})
(I would have posted this as a comment, but couldn't)
For anyone who lands here trying to update one field using another in the document with the c# driver...
I could not figure out how to use any of the UpdateXXX methods and their associated overloads since they take an UpdateDefinition as an argument.
// we want to set Prop1 to Prop2
class Foo { public string Prop1 { get; set; } public string Prop2 { get; set; } }

void Test()
{
    var update = new UpdateDefinitionBuilder<Foo>();
    update.Set(x => x.Prop1, <new value; no way to get a hold of the object that I can find>)
}
As a workaround, I found that you can use the RunCommand method on an IMongoDatabase (https://docs.mongodb.com/manual/reference/command/update/#dbcmd.update).
var command = new BsonDocument
{
    { "update", "CollectionToUpdate" },
    { "updates", new BsonArray
        {
            new BsonDocument
            {
                // Any filter; here the check is if Prop1 does not exist
                { "q", new BsonDocument { ["Prop1"] = new BsonDocument("$exists", false) } },
                // set it to the value of Prop2
                { "u", new BsonArray { new BsonDocument { ["$set"] = new BsonDocument("Prop1", "$Prop2") } } },
                { "multi", true }
            }
        }
    }
};
database.RunCommand<BsonDocument>(command);
MongoDB 4.2+ Golang
result, err := collection.UpdateMany(ctx, bson.M{},
    mongo.Pipeline{
        bson.D{{"$set",
            bson.M{"name": bson.M{"$concat": []string{"$lastName", " ", "$firstName"}}},
        }},
    },
)
I have an array 'pets': [{'fido': ['abc']}] that is an embedded document. When I add a pet to the array, how can I check to see if that pet already exists? For instance, if I added fido again... how can I check if fido already exists and not add it? I was hoping I could use $addToSet, but I only want to check part of the set (the pet's name).
User.prototype.updatePetArray = function(userId, petName) {
  userId = { _id: ObjectId(userId) };
  return this.collection.findOneAndUpdate(userId,
    { $addToSet: { pets: { [petName]: [] } } },
    { returnOriginal: false,
      maxTimeMS: QUERY_TIME });
};
Result of adding fido twice:
{u'lastErrorObject': {u'updatedExisting': True, u'n': 1}, u'ok': 1, u'value': {u'username': u'bob123', u'_id': u'56d5fc8381c9c28b3056f794', u'location': u'AT', u'pets': [{u'fido': []}]}}
{u'lastErrorObject': {u'updatedExisting': True, u'n': 1}, u'ok': 1, u'value': {u'username': u'bob123', u'_id': u'56d5fc8381c9c28b3056f794', u'location': u'AT', u'pets': [{u'fido': [u'abc']}, {u'fido': []}]}}
If there is always going to be "variable" content within each member of the "pets" array (i.e. petName as the key), then $addToSet is not for you. At least not at the array level where you are looking to apply it.
Instead you basically need an $exists test on the "key" of the document being contained in the array, then either $addToSet to the "contained" array of that matched key with the positional $ operator, or where the "key" was not matched then $push directly to the "pets" array, with the new inner content directly as the sole array member.
So if you can live with not returning the modified document, then "Bulk" operations are for you. In modern drivers with bulkWrite():
User.prototype.updatePetArray = function(userId, petName, content) {
  var filter1 = { "_id": ObjectId(userId) },
      filter2 = { "_id": ObjectId(userId) },
      update1 = { "$addToSet": {} },
      update2 = { "$push": { "pets": {} } };

  filter1["pets." + petName] = { "$exists": true };
  filter2["pets." + petName] = { "$exists": false };

  var setter1 = {};
  setter1["pets.$." + petName] = content;
  update1["$addToSet"] = setter1;

  var setter2 = {};
  setter2[petName] = [content];
  update2["$push"]["pets"] = setter2;

  // Return the promise that yields the BulkWriteResult of both calls
  return this.collection.bulkWrite([
    { "updateOne": {
      "filter": filter1,
      "update": update1
    }},
    { "updateOne": {
      "filter": filter2,
      "update": update2
    }}
  ]);
};
If you must return the modified document, then you are going to need to resolve each call and return the one that actually matched something:
User.prototype.updatePetArray = function(userId, petName, content) {
  var filter1 = { "_id": ObjectId(userId) },
      filter2 = { "_id": ObjectId(userId) },
      update1 = { "$addToSet": {} },
      update2 = { "$push": { "pets": {} } };

  filter1["pets." + petName] = { "$exists": true };
  filter2["pets." + petName] = { "$exists": false };

  var setter1 = {};
  setter1["pets.$." + petName] = content;
  update1["$addToSet"] = setter1;

  var setter2 = {};
  setter2[petName] = [content];
  update2["$push"]["pets"] = setter2;

  // Capture this, since it refers to something else inside the callback below
  var self = this;

  // Return the promise that returns the result that matched and modified
  return new Promise(function(resolve, reject) {
    var operations = [
      self.collection.findOneAndUpdate(filter1, update1, { "returnOriginal": false }),
      self.collection.findOneAndUpdate(filter2, update2, { "returnOriginal": false })
    ];
    // Promise.all runs both, and we discard the null document
    Promise.all(operations).then(function(result) {
      resolve(result.filter(function(el) { return el.value != null })[0].value);
    }, reject);
  });
};
In either case this requires "two" update attempts where only "one" will actually succeed and modify the document, since only one of the $exists tests is going to be true.
So as an example of that first case, the "query" and "update" are resolving after interpolation as:
{
"_id": ObjectId("56d7b759e955e2812c6c8c1b"),
"pets.fido": { "$exists": true }
},
{ "$addToSet": { "pets.$.fido": "ccc" } }
And the second update as:
{
"_id": ObjectId("56d7b759e955e2812c6c8c1b"),
"pets.fido": { "$exists": false }
},
{ "$push": { "pets": { "fido": ["ccc"] } } }
Given variables of:
userId = "56d7b759e955e2812c6c8c1b",
petName = "fido",
content = "ccc";
Personally I would not be naming keys like this, but rather change the structure to:
{
"_id": ObjectId("56d7b759e955e2812c6c8c1b"),
"pets": [{ "name": "fido", "data": ["abc"] }]
}
That makes the update statements easier, and without the need for variable interpolation into the key names. For example:
{
"_id": ObjectId(userId),
"pets.name": petName
},
{ "$addToSet": { "pets.$.data": content } }
and:
{
"_id": ObjectId(userId),
"pets.name": { "$ne": petName }
},
{ "$push": { "pets": { "name": petName, "data": [content] } } }
Which feels a whole lot cleaner and can actually use an "index" for matching, which of course $exists simply cannot.
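For completeness, the supporting multikey index would be something like this shell one-liner (collection name assumed):

db.collection.createIndex({ "pets.name": 1 })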
There is of course more overhead if using .findOneAndUpdate(), since this is afterall "two" actual calls to the server for which you need to await a response as opposed to the Bulk method which is just "one".
But if you need the returned document ( option is the default in the driver anyway ) then either do that or similarly await the Promise resolve from the .bulkWrite() and then fetch the document via .findOne() after completion. Albeit that doing it via .findOne() after the modification would not truly be "atomic" and could possibly return the document "after" another similar modification was made, and not only in the state of that particular change.
N.B Also assuming that apart from the keys of the subdocuments in "pets" as a "set" that your other intention for the array contained was adding to that "set" as well via the additional content supplied to the function. If you just wanted to overwrite a value, then just apply $set instead of $addToSet and similarly wrap as an array.
But it sounds reasonable that the former was what you were asking.
BTW. Please clean up my horrible setup code for the query and update objects in your actual code :)
As a self contained listing to demonstrate:
var async = require('async'),
    mongodb = require('mongodb'),
    MongoClient = mongodb.MongoClient;

MongoClient.connect('mongodb://localhost/test', function(err, db) {

  var coll = db.collection('pettest');

  var petName = "fido",
      content = "bbb";

  var filter1 = { "_id": 1 },
      filter2 = { "_id": 1 },
      update1 = { "$addToSet": {} },
      update2 = { "$push": { "pets": {} } };

  filter1["pets." + petName] = { "$exists": true };
  filter2["pets." + petName] = { "$exists": false };

  var setter1 = {};
  setter1["pets.$." + petName] = content;
  update1["$addToSet"] = setter1;

  var setter2 = {};
  setter2[petName] = [content];
  update2["$push"]["pets"] = setter2;

  console.log(JSON.stringify(update1, undefined, 2));
  console.log(JSON.stringify(update2, undefined, 2));

  function CleanInsert(callback) {
    async.series(
      [
        // Clean data
        function(callback) {
          coll.deleteMany({}, callback);
        },
        // Insert sample
        function(callback) {
          coll.insert({ "_id": 1, "pets": [{ "fido": ["abc"] }] }, callback);
        }
      ],
      callback
    );
  }

  async.series(
    [
      CleanInsert,
      // Modify Bulk
      function(callback) {
        coll.bulkWrite([
          { "updateOne": {
            "filter": filter1,
            "update": update1
          }},
          { "updateOne": {
            "filter": filter2,
            "update": update2
          }}
        ]).then(function(res) {
          console.log(JSON.stringify(res, undefined, 2));
          coll.findOne({ "_id": 1 }).then(function(res) {
            console.log(JSON.stringify(res, undefined, 2));
            callback();
          });
        }, callback);
      },
      CleanInsert,
      // Modify Promise all
      function(callback) {
        var operations = [
          coll.findOneAndUpdate(filter1, update1, { "returnOriginal": false }),
          coll.findOneAndUpdate(filter2, update2, { "returnOriginal": false })
        ];
        Promise.all(operations).then(function(res) {
          //console.log(JSON.stringify(res,undefined,2));
          console.log(
            JSON.stringify(
              res.filter(function(el) { return el.value != null })[0].value
            )
          );
          callback();
        }, callback);
      }
    ],
    function(err) {
      if (err) throw err;
      db.close();
    }
  );
});
And the output:
{
  "$addToSet": {
    "pets.$.fido": "bbb"
  }
}
{
  "$push": {
    "pets": {
      "fido": [
        "bbb"
      ]
    }
  }
}
{
  "ok": 1,
  "writeErrors": [],
  "writeConcernErrors": [],
  "insertedIds": [],
  "nInserted": 0,
  "nUpserted": 0,
  "nMatched": 1,
  "nModified": 1,
  "nRemoved": 0,
  "upserted": []
}
{
  "_id": 1,
  "pets": [
    {
      "fido": [
        "abc",
        "bbb"
      ]
    }
  ]
}
{"_id":1,"pets":[{"fido":["abc","bbb"]}]}
Feel free to change to different values to see how different "sets" are applied.
Please try this one with a string template; here is an example running under the mongo shell.
> var name = 'fido';
> var t = `pets.${name}`; // string template; interpolates the name variable
> db.pets.find()
{ "_id" : ObjectId("56d7b5019ed174b9eae2b9c5"), "pets" : [ { "fido" : [ "abc" ]} ] }
With the following update command, it will not update the document if the same pet name already exists.
> db.pets.update({[t]: {$exists: false}}, {$addToSet: {pets: {[name]: []}}})
WriteResult({ "nMatched" : 0, "nUpserted" : 0, "nModified" : 0 })
If the pets document is
> db.pets.find()
{ "_id" : ObjectId("56d7b7149ed174b9eae2b9c6"), "pets" : [ { "fi" : [ "abc" ] } ] }
After update with
> db.pets.update({[t]: {$exists: false}}, {$addToSet: {pets: {[name]: []}}})
WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 1 })
The result shows that the pet name is added if it does not exist:
> db.pets.find()
{ "_id" : ObjectId("56d7b7149ed174b9eae2b9c6"), "pets" : [ { "fi" : [ "abc" ] }, { "fido" : [ ] } ] }
I'm trying to prepare a pre-aggregated data set from a log file for later analysis.
For example, I have a log file such as this:
2016-01-01 11:13:06 -0900 alphabetical|a
2016-01-01 11:20:16 -0900 alphabetical|a
2016-01-01 11:21:52 -0900 alphabetical|b
The data (after date/time/timezone) is split on a pipe:
entry|detail
I'm creating a data set that has a separate document for each year-month and entry.
My data as a result looks like this: https://jsonblob.com/56a7d7d8e4b01190df4b8a55
{
  "action": "alphabetical",
  "date": "2016-0",
  "detail": {
    "a": {
      "daily": { "1": 5, "2": 4, "3": 5 },
      "monthly": 14
    },
    "b": {
      "daily": { "1": 5, "2": 5, "3": 2 },
      "monthly": 12
    },
    "c": {
      "daily": { "1": 2, "2": 2, "3": 2 },
      "monthly": 6
    },
    "d": {
      "daily": { "3": 1 },
      "monthly": 1
    }
  },
  "monthly": 33,
  "daily": { "1": 12, "2": 11, "3": 10 },
  "dow": { "0": 10, "5": 12, "6": 11 }
}
By using:
var logHit = function(data, callback) {
  var update = {};
  var inc = {};
  var detail = data.data.info[1];

  inc['detail.' + escape(detail) + '.daily.' + data.date.d] = 1;
  inc['detail.' + escape(detail) + '.monthly'] = 1;
  inc['monthly'] = 1;
  inc['daily.' + data.date.d] = 1;
  inc['dow.' + data.date.dow] = 1;
  update['$inc'] = inc;

  collection.update(
    {
      directory_id: data.directory_id,
      date: data.date.y + '-' + data.date.m,
      action: data.data.info[0],
    },
    update,
    { upsert: true },
    function(error, result) {
      assert.equal(error, null);
      assert.equal(1, result.result.n);
      callback();
    });
}
While the data that I'm looking to store is included, working with it as an object series makes it harder to process when it is retrieved. I'm using d3.js and have to convert objects to arrays.
How do I store the data in arrays instead of objects, like this: https://jsonblob.com/56a7da76e4b01190df4b8a74
{
  "action": "alphabetical",
  "date": "2016-0",
  "detail": [
    {
      "name": "a",
      "daily": [ { "count": 5 }, { "count": 4 }, { "count": 5 } ],
      "monthly": 14
    },
    {
      "name": "b",
      "daily": [ { "count": 5 }, { "count": 5 }, { "count": 2 } ],
      "monthly": 12
    },
    {
      "name": "c",
      "daily": [ { "count": 2 }, { "count": 2 }, { "count": 2 } ],
      "monthly": 6
    },
    {
      "name": "d",
      "daily": [ {}, {}, { "count": 1 } ],
      "monthly": 1
    }
  ],
  "monthly": 33,
  "daily": { "1": 12, "2": 11, "3": 10 },
  "dow": { "0": 10, "5": 12, "6": 11 }
}
where the objects become part of an array, and the key is instead put inside the array element, similar to this answer: https://stackoverflow.com/a/30751981/197546
In MongoDB, array documents can be referenced by index, but not by value. For instance, in your target data model you can change the name value of the first array element with the update argument:
{ $set: { "detail.0.name": "me" } }
Or even increment a deeply nested value like:
{ $inc: { "detail.0.daily.0.count": 1 } }
But in both cases knowing the index is necessary, which doesn't seem like that would work for your use case.
I would recommend referencing the docs on Array Update operators as well.
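If you adopt that array-of-subdocuments shape (e.g. { "name": "a", "daily": [...], "monthly": 14 }), one workaround, echoing the two-step pets pattern earlier on this page, is to first insert the element when its name is absent and then increment it by value through the positional $ operator. A rough sketch with a hypothetical detailName variable (the upsert of a brand-new document and error handling are omitted):

// step 1: add the element only if no entry with this name exists yet
collection.update(
  { directory_id: data.directory_id, "detail.name": { $ne: detailName } },
  { $push: { detail: { name: detailName, daily: [], monthly: 0 } } }
);
// step 2: increment the element matched by value via the positional $ operator
collection.update(
  { directory_id: data.directory_id, "detail.name": detailName },
  { $inc: { "detail.$.monthly": 1 } }
);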