Add unique value to every element in array - node.js
I'm fairly new to MongoDB and I'm trying to merge data into an embedded array in a MongoDB collection. The schema for my Project collection is as follows:
Projects:
{
_id: ObjectId(),
client_id: String,
description: String,
samples: [
{
location: String, //Unique
name: String,
}
...
]
}
A user can upload a JSON file that is in the form of:
[
{
location: String, //Same location as in above schema
concentration: float
}
...
]
The length of the samples array is the same as the length of the uploaded data array. I'm trying to figure out how to add the uploaded data into every element of my samples array, but I can't find out how to do it from the MongoDB documentation. I can load my JSON data in as "data", and I want to merge based on the common "location" field:
db.projects.update({_id: myId}, {$set : {samples.$[].data : data[location]}});
But I can't work out how to get the index into the JSON array in the update query, and I haven't been able to find any examples in the MongoDB documentation, or questions like this one.
Any help would be much appreciated!
MongoDB 3.6 Positional Filtered Updates
So you're actually in the right "ballpark" with the all positional $[] operator, but the problem is that it simply applies to "every" array element. Since what you want is "matched" entries, you actually want the filtered positional $[<identifier>] operator instead.
As you note, your "location" is going to be unique within the array. Using "index positions" is really not reliable for atomic updates, but matching the "unique" properties is. Basically, you need to get from something like this:
let input = [
{ location: "A", concentration: 3, other: "c" },
{ location: "C", concentration: 4, other: "a" }
];
To this:
{
"$set": {
"samples.$[l0].concentration": 3,
"samples.$[l0].other": "c",
"samples.$[l1].concentration": 4,
"samples.$[l1].other": "a"
},
"arrayFilters": [
{
"l0.location": "A"
},
{
"l1.location": "C"
}
]
}
And that really is just a matter of applying some basic functions to the provided input array:
let arrayFilters = input.map(({ location },i) => ({ [`l${i}.location`]: location }));
let $set = input.reduce((o,{ location, ...e },i) =>
({
...o,
...Object.entries(e).reduce((oe,[k,v]) => ({ ...oe, [`samples.$[l${i}].${k}`]: v }),{})
}),
{}
);
log({ $set, arrayFilters });
The Array.map() simply takes the values of the input and creates a list of identifiers to match against the location values within arrayFilters. The construction of the $set statement uses a nested Array.reduce(): the outer iteration merges the keys produced for each array element processed, and the inner iteration produces a key for each property present in that element, after removing location from consideration since it is not being updated.
Alternatively, loop with for..of:
let arrayFilters = [];
let $set = {};
for ( let [i, { location, ...e }] of Object.entries(input) ) {
arrayFilters.push({ [`l${i}.location`]: location });
for ( let [k,v] of Object.entries(e) ) {
$set[`samples.$[l${i}].${k}`] = v;
}
}
Note we use Object.entries() here as well as the "object spread" ... in construction. If you find yourself in a JavaScript environment without this support, then Object.keys() and Object.assign() are basically drop-in replacements with little change.
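For instance, a minimal sketch of the same construction without object spread or Object.entries(), using only Object.keys() and plain property assignment, which produces an identical $set and arrayFilters:

```javascript
// Same construction as above, but in ES5-compatible style:
// no object spread, no Object.entries()
let input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" }
];

let arrayFilters = input.map(function(e, i) {
  var filter = {};
  filter['l' + i + '.location'] = e.location;  // e.g. { "l0.location": "A" }
  return filter;
});

let $set = input.reduce(function(o, e, i) {
  Object.keys(e)
    .filter(function(k) { return k !== 'location'; })
    .forEach(function(k) {
      // e.g. { "samples.$[l0].concentration": 3 }
      o['samples.$[l' + i + '].' + k] = e[k];
    });
  return o;
}, {});

console.log(JSON.stringify({ $set: $set, arrayFilters: arrayFilters }, null, 2));
```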
Then those can actually be applied within an update as in:
Project.update({ client_id: 'ClientA' }, { $set }, { arrayFilters });
So the filtered positional $[<identifier>] is actually used here to create "matching pairs" of entries within the $set modifier and within the arrayFilters option of the update(). For each "location" we create an identifier that matches that value within the arrayFilters, and then use that same identifier within the actual $set statement in order to update only the array entry which matches the condition for that identifier.
The only real rule with "identifiers" is that they cannot start with a number, and they "should" be unique, but it's not a rule and you simply get the first match anyway. The updates then only touch those entries which actually match the condition.
Earlier MongoDB fixed indexes
Failing support for that, you are basically falling back to "index positions", and that's really not that reliable. More often than not, you will actually need to read each document and determine what is in the array already before even updating. But with at least presumed "parity", where index positions are in place, then:
let input = [
{ location: "A", concentration: 3 },
{ location: "B", concentration: 5 },
{ location: "C", concentration: 4 }
];
let $set = input.reduce((o,e,i) =>
({ ...o, [`samples.${i}.concentration`]: e.concentration }),{}
);
log({ $set });
Producing an update statement like:
{
"$set": {
"samples.0.concentration": 3,
"samples.1.concentration": 5,
"samples.2.concentration": 4
}
}
Or without the parity:
let input = [
{ location: "A", concentration: 3, other: "c" },
{ location: "C", concentration: 4, other: "a" }
];
// Need to get the document to compare without parity
let doc = await Project.findOne({ "client_id": "ClientA" });
let $set = input.reduce((o,e,i) =>
({
...o,
...Object.entries(e).filter(([k,v]) => k !== "location")
.reduce((oe,[k,v]) =>
({
...oe,
[`samples.${doc.samples.map(c => c.location).indexOf(e.location)}`
+ `.${k}`]: v
}),
{}
)
}),
{}
);
log({ $set });
await Project.update({ client_id: 'ClientA' },{ $set });
Producing the statement matching on the indexes (after you actually read the document):
{
"$set": {
"samples.0.concentration": 3,
"samples.0.other": "c",
"samples.2.concentration": 4,
"samples.2.other": "a"
}
}
Note of course that for each "update set" you really don't have any other option than to read from the document first to determine which indexes you will update. This is generally not a good idea: aside from the overhead of needing to read each document before a write, there is no absolute guarantee that the array itself remains unchanged by other processes in between the read and the write. Using a "hard index" makes the presumption that everything is still the same, when that may not actually be the case.
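One further thing worth guarding against in the snippet above: indexOf() returns -1 when a location is not present in the document, which would silently produce a path like samples.-1.concentration. A small sketch of the same construction with that guard added (docSamples here stands in for the doc.samples array read back from findOne()):

```javascript
// Build a "hard index" $set, skipping input entries whose location
// does not exist in the samples array that was read beforehand.
function buildIndexedSet(docSamples, input) {
  const locations = docSamples.map(s => s.location);
  return input.reduce((o, { location, ...fields }) => {
    const idx = locations.indexOf(location);
    if (idx === -1) return o;  // no match: skip rather than emit "samples.-1.*"
    for (const [k, v] of Object.entries(fields)) {
      o[`samples.${idx}.${k}`] = v;
    }
    return o;
  }, {});
}

const docSamples = [
  { location: "A", name: "Location A" },
  { location: "B", name: "Location B" },
  { location: "C", name: "Location C" }
];
const input = [
  { location: "A", concentration: 3, other: "c" },
  { location: "C", concentration: 4, other: "a" },
  { location: "Z", concentration: 9 }  // not in the document: ignored
];

console.log(buildIndexedSet(docSamples, input));
```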
Earlier MongoDB positional matches
Where data permits, it's generally better to cycle standard positional matched $ updates instead. Here location is indeed unique, so it's a good candidate, and most importantly you do not need to read the existing documents to compare arrays for indexes:
let input = [
{ location: "A", concentration: 3, other: "c" },
{ location: "C", concentration: 4, other: "a" }
];
let batch = input.map(({ location, ...e }) =>
({
updateOne: {
filter: { client_id: "ClientA", 'samples.location': location },
update: {
$set: Object.entries(e)
.reduce((oe,[k,v]) => ({ ...oe, [`samples.$.${k}`]: v }), {})
}
}
})
);
log({ batch });
await Project.bulkWrite(batch);
A bulkWrite() sends multiple update operations, but it does so with a single request and a single response, just like any other update operation. Indeed, if you are processing a "list of changes", then constructing one big bulkWrite() is the direction to go in instead of individual writes; that actually applies to all the previous examples as well.
The big difference is "one update instruction per array element" in the change set. This is the safe way to do things in releases without "positional filtered" support, even if it means more write operations.
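As a side note, if the change set is very large, the operations can be chunked into several bulkWrite() calls; the batching itself is plain JavaScript. A sketch, where the chunk size of 1000 is an arbitrary choice, not a driver requirement:

```javascript
// Split an array of bulk operations into chunks of at most `size`,
// so each chunk can be passed to a separate bulkWrite() call.
function chunkOps(ops, size) {
  const chunks = [];
  for (let i = 0; i < ops.length; i += size) {
    chunks.push(ops.slice(i, i + size));
  }
  return chunks;
}

// e.g. 2500 updateOne operations -> batches of 1000, 1000 and 500
const ops = Array.from({ length: 2500 }, (_, n) => ({
  updateOne: {
    filter: { 'samples.location': `L${n}` },
    update: { $set: { 'samples.$.concentration': n } }
  }
}));

const batches = chunkOps(ops, 1000);
console.log(batches.map(b => b.length)); // [ 1000, 1000, 500 ]
```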
Demonstration
A full listing in demonstration follows. Note I'm using "mongoose" here for simplicity, but there is nothing really "mongoose specific" about the actual updates themselves. The same applies to any implementation, and in particular in this case the JavaScript examples of using Array.map() and Array.reduce() to process the list for construction.
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug',true);
const sampleSchema = new Schema({
location: String,
name: String,
concentration: Number,
other: String
});
const projectSchema = new Schema({
client_id: String,
description: String,
samples: [sampleSchema]
});
const Project = mongoose.model('Project', projectSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
await Project.create({
client_id: "ClientA",
description: "A Client",
samples: [
{ location: "A", name: "Location A" },
{ location: "B", name: "Location B" },
{ location: "C", name: "Location C" }
]
});
let input = [
{ location: "A", concentration: 3, other: "c" },
{ location: "C", concentration: 4, other: "a" }
];
let arrayFilters = input.map(({ location },i) => ({ [`l${i}.location`]: location }));
let $set = input.reduce((o,{ location, ...e },i) =>
({
...o,
...Object.entries(e).reduce((oe,[k,v]) => ({ ...oe, [`samples.$[l${i}].${k}`]: v }),{})
}),
{}
);
log({ $set, arrayFilters });
await Project.update(
{ client_id: 'ClientA' },
{ $set },
{ arrayFilters }
);
let project = await Project.findOne();
log(project);
mongoose.disconnect();
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
And the output, for those who cannot be bothered to run it, shows the matching array elements updated:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b1778605c59470ecaf10fac"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b1778605c59470ecaf10faf"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b1778605c59470ecaf10fae"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b1778605c59470ecaf10fad"), location: 'C', name: 'Location C' } ], __v: 0 })
{
"$set": {
"samples.$[l0].concentration": 3,
"samples.$[l0].other": "c",
"samples.$[l1].concentration": 4,
"samples.$[l1].other": "a"
},
"arrayFilters": [
{
"l0.location": "A"
},
{
"l1.location": "C"
}
]
}
Mongoose: projects.update({ client_id: 'ClientA' }, { '$set': { 'samples.$[l0].concentration': 3, 'samples.$[l0].other': 'c', 'samples.$[l1].concentration': 4, 'samples.$[l1].other': 'a' } }, { arrayFilters: [ { 'l0.location': 'A' }, { 'l1.location': 'C' } ] })
Mongoose: projects.findOne({}, { fields: {} })
{
"_id": "5b1778605c59470ecaf10fac",
"client_id": "ClientA",
"description": "A Client",
"samples": [
{
"_id": "5b1778605c59470ecaf10faf",
"location": "A",
"name": "Location A",
"concentration": 3,
"other": "c"
},
{
"_id": "5b1778605c59470ecaf10fae",
"location": "B",
"name": "Location B"
},
{
"_id": "5b1778605c59470ecaf10fad",
"location": "C",
"name": "Location C",
"concentration": 4,
"other": "a"
}
],
"__v": 0
}
Or by hard index:
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug',true);
const sampleSchema = new Schema({
location: String,
name: String,
concentration: Number,
other: String
});
const projectSchema = new Schema({
client_id: String,
description: String,
samples: [sampleSchema]
});
const Project = mongoose.model('Project', projectSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
await Project.create({
client_id: "ClientA",
description: "A Client",
samples: [
{ location: "A", name: "Location A" },
{ location: "B", name: "Location B" },
{ location: "C", name: "Location C" }
]
});
let input = [
{ location: "A", concentration: 3, other: "c" },
{ location: "C", concentration: 4, other: "a" }
];
// Need to get the document to compare without parity
let doc = await Project.findOne({ "client_id": "ClientA" });
let $set = input.reduce((o,e,i) =>
({
...o,
...Object.entries(e).filter(([k,v]) => k !== "location")
.reduce((oe,[k,v]) =>
({
...oe,
[`samples.${doc.samples.map(c => c.location).indexOf(e.location)}`
+ `.${k}`]: v
}),
{}
)
}),
{}
);
log({ $set });
await Project.update(
{ client_id: 'ClientA' },
{ $set },
);
let project = await Project.findOne();
log(project);
mongoose.disconnect();
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
And the output:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b1778e0f7be250f2b7c3fc8"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b1778e0f7be250f2b7c3fcb"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b1778e0f7be250f2b7c3fca"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b1778e0f7be250f2b7c3fc9"), location: 'C', name: 'Location C' } ], __v: 0 })
Mongoose: projects.findOne({ client_id: 'ClientA' }, { fields: {} })
{
"$set": {
"samples.0.concentration": 3,
"samples.0.other": "c",
"samples.2.concentration": 4,
"samples.2.other": "a"
}
}
Mongoose: projects.update({ client_id: 'ClientA' }, { '$set': { 'samples.0.concentration': 3, 'samples.0.other': 'c', 'samples.2.concentration': 4, 'samples.2.other': 'a' } }, {})
Mongoose: projects.findOne({}, { fields: {} })
{
"_id": "5b1778e0f7be250f2b7c3fc8",
"client_id": "ClientA",
"description": "A Client",
"samples": [
{
"_id": "5b1778e0f7be250f2b7c3fcb",
"location": "A",
"name": "Location A",
"concentration": 3,
"other": "c"
},
{
"_id": "5b1778e0f7be250f2b7c3fca",
"location": "B",
"name": "Location B"
},
{
"_id": "5b1778e0f7be250f2b7c3fc9",
"location": "C",
"name": "Location C",
"concentration": 4,
"other": "a"
}
],
"__v": 0
}
And of course with standard "positional" $ syntax and updates:
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug',true);
const sampleSchema = new Schema({
location: String,
name: String,
concentration: Number,
other: String
});
const projectSchema = new Schema({
client_id: String,
description: String,
samples: [sampleSchema]
});
const Project = mongoose.model('Project', projectSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
await Project.create({
client_id: "ClientA",
description: "A Client",
samples: [
{ location: "A", name: "Location A" },
{ location: "B", name: "Location B" },
{ location: "C", name: "Location C" }
]
});
let input = [
{ location: "A", concentration: 3, other: "c" },
{ location: "C", concentration: 4, other: "a" }
];
let batch = input.map(({ location, ...e }) =>
({
updateOne: {
filter: { client_id: "ClientA", 'samples.location': location },
update: {
$set: Object.entries(e)
.reduce((oe,[k,v]) => ({ ...oe, [`samples.$.${k}`]: v }), {})
}
}
})
);
log({ batch });
await Project.bulkWrite(batch);
let project = await Project.findOne();
log(project);
mongoose.disconnect();
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
And output:
Mongoose: projects.remove({}, {})
Mongoose: projects.insertOne({ _id: ObjectId("5b179142662616160853ba4a"), client_id: 'ClientA', description: 'A Client', samples: [ { _id: ObjectId("5b179142662616160853ba4d"), location: 'A', name: 'Location A' }, { _id: ObjectId("5b179142662616160853ba4c"), location: 'B', name: 'Location B' }, { _id: ObjectId("5b179142662616160853ba4b"), location: 'C', name: 'Location C' } ], __v: 0 })
{
"batch": [
{
"updateOne": {
"filter": {
"client_id": "ClientA",
"samples.location": "A"
},
"update": {
"$set": {
"samples.$.concentration": 3,
"samples.$.other": "c"
}
}
}
},
{
"updateOne": {
"filter": {
"client_id": "ClientA",
"samples.location": "C"
},
"update": {
"$set": {
"samples.$.concentration": 4,
"samples.$.other": "a"
}
}
}
}
]
}
Mongoose: projects.bulkWrite([ { updateOne: { filter: { client_id: 'ClientA', 'samples.location': 'A' }, update: { '$set': { 'samples.$.concentration': 3, 'samples.$.other': 'c' } } } }, { updateOne: { filter: { client_id: 'ClientA', 'samples.location': 'C' }, update: { '$set': { 'samples.$.concentration': 4, 'samples.$.other': 'a' } } } } ], {})
Mongoose: projects.findOne({}, { fields: {} })
{
"_id": "5b179142662616160853ba4a",
"client_id": "ClientA",
"description": "A Client",
"samples": [
{
"_id": "5b179142662616160853ba4d",
"location": "A",
"name": "Location A",
"concentration": 3,
"other": "c"
},
{
"_id": "5b179142662616160853ba4c",
"location": "B",
"name": "Location B"
},
{
"_id": "5b179142662616160853ba4b",
"location": "C",
"name": "Location C",
"concentration": 4,
"other": "a"
}
],
"__v": 0
}
Related
Update MongoDb data inside a field
I have a MongoDB database with a collection called rooms. In rooms, I want to search for a particular object by the roomId property and update the array contents in the found object. For instance, initially, before making a request to that endpoint, the desired data looks like this:

{
  available: true,
  _id: 60817a403170bf49185c7db7,
  player1: "Jack",
  player2: "Adam",
  roomId: "ABCDE",
  pieces: [
    { point: [6, 0], player: "1", type: "P" },
    ...
  ],
  __v: 0
}

After making a request like http://localhost:8000/room/ABCDE?x1=6&y1=0&x2=4&y2=0, the data in MongoDB should update to:

{
  available: true,
  _id: 60817a403170bf49185c7db7,
  player1: "Jack",
  player2: "Adam",
  roomId: "ABCDE",
  pieces: [
    { point: [4, 0], /* DATA UPDATED */ player: "1", type: "P" },
    ...
  ],
  __v: 0
}

This is my api.js file; what should the code be in the comments in order to execute this?

const express = require('express');
const GameRoom = require('../models/room');
const router = express.Router();

router.put('/room/:id', (req, res, next) => {
  GameRoom.findOne({ roomId: req.params.id })
    .then(data => {
      const pieces = data.pieces;
      for (let i = 0; i < pieces.length; i++) {
        if (pieces[i].point[0] === parseInt(req.query.x1) &&
            pieces[i].point[1] === parseInt(req.query.y1)) {
          pieces[i].point = [req.query.x2, req.query.y2];
          break;
        }
      }
      // Take the modified pieces array and update it
    }).catch(next);
});
You can use $[<identifier>] to find and update the points:

GameRoom.updateOne(
  { roomId: req.params.id },
  { $set: {
      "pieces.$[el].point.0": req.query.x2,
      "pieces.$[el].point.1": req.query.y2
  }},
  { arrayFilters: [
      { "el.point.0": req.query.x1, "el.point.1": req.query.y1 }
  ]}
);

Or if you just want to update the first piece that matches the condition, then you can use $elemMatch with the $ operator:

GameRoom.updateOne(
  { roomId: req.params.id,
    pieces: { $elemMatch: { "point.0": req.query.x1, "point.1": req.query.y1 } } },
  { $set: { "pieces.$.point.0": req.query.x2, "pieces.$.point.1": req.query.y2 } }
);
What is the best approach for referencing objectIds plus additional parameters in a schema?
I am creating a schema for an Order model that will track the items ordered along with the quantity purchased. I want to keep the itemId references and the quantity tied together as an array in one parameter. I have created an array that includes a reference to the ObjectId plus an additional Number type. I am currently unable to populate the product information using a .populate() query.

Order schema:

const mongoose = require("mongoose");
const { Schema } = mongoose;

const orderSchema = new Schema({
  orderNumber: String,
  _itemsOrdered: [
    {
      itemId: { type: mongoose.Schema.Types.ObjectId, ref: "menuItems" },
      quantity: Number
    }
  ]
});

mongoose.model("orders", orderSchema);

MenuItem schema:

const mongoose = require("mongoose");
const { Schema } = mongoose;

const MenuItemSchema = new Schema({
  imageURL: String,
  name_en: String,
  name_es: String,
  type_en: String,
  type_es: String,
  description_en: String,
  description_es: String,
  dietaryCallouts: [String],
  price: Number
});

mongoose.model("menuItems", MenuItemSchema);
module.export = MenuItemSchema;

I am able to save the record but cannot populate the MenuItem information with the following query in the order controller:

async show(req, res, next) {
  try {
    const orderId = req.params.id;
    let order = await Order.findById({ _id: orderId }).populate(
      "_itemsOrdered.itemId"
    );
    res.send(order);
  } catch (err) {
    res.status(402).send(err);
  }
}

Here is the order object that is being saved to the DB:

{
  "_id": "5dc93b9c0085b8045e0c8aa3",
  "orderNumber": "Order 3",
  "_itemsOrdered": [
    {
      "_id": "5dc93b9c0085b8045e0c8aa5",
      "itemId": "5dc7f814a2679b47319a79a4",
      "quantity": 1
    },
    {
      "_id": "5dc93b9c0085b8045e0c8aa4",
      "itemId": "5dc7e5c7de590744c46f93da",
      "quantity": 2
    }
  ],
  "__v": 0
}
Your order schema must be like this:

const orderSchema = new Schema({
  orderNumber: String,
  _itemsOrdered: [
    {
      itemId: { type: mongoose.Schema.Types.ObjectId, ref: "menuItems" },
      quantity: Number
    }
  ]
});

And you can use the following route to create an order document:

router.post("/order", async (req, res, next) => {
  try {
    const { orderNumber, _itemsOrdered } = req.body;
    let order = new Order({ orderNumber, _itemsOrdered });
    order = await order.save();
    res.status(201).send(order);
  } catch (err) {
    console.log(err);
    res.status(500).send(err);
  }
});

Sample body (you need to change the ids according to yours):

{
  "orderNumber": "Order 1",
  "_itemsOrdered": [
    { "itemId": "5dc90346222b892434e4675a", "quantity": 1 },
    { "itemId": "5dc90359222b892434e4675b", "quantity": 2 }
  ]
}

To get the order and its items you can use populate like this:

router.get("/orders/:id", async (req, res) => {
  try {
    const orderAndItems = await Order.findById(req.params.id).populate(
      "_itemsOrdered.itemId"
    );
    res.send(orderAndItems);
  } catch (err) {
    console.log(err);
    res.status(500).send(err);
  }
});

This will give you a result like this:

{
  "_id": "5dc904db8407a217b4dfe6f4",
  "orderNumber": "Order 1",
  "_itemsOrdered": [
    {
      "_id": "5dc904db8407a217b4dfe6f6",
      "itemId": {
        "_id": "5dc90346222b892434e4675a",
        "name_en": "item 1",
        "price": 1,
        "__v": 0
      },
      "quantity": 1
    },
    {
      "_id": "5dc904db8407a217b4dfe6f5",
      "itemId": {
        "_id": "5dc90359222b892434e4675b",
        "name_en": "item 2",
        "price": 2,
        "__v": 0
      },
      "quantity": 2
    }
  ],
  "__v": 0
}
How to insert multiple JSON documents into Elasticsearch
Input data:

[{
  "_index": "abc",
  "_type": "_doc",
  "_id": "QAE",
  "_score": 6.514091,
  "_source": {
    "category": "fruits",
    "action": "eating",
    "metainfo": { "hash": "nzUZ1ONm0e167p" },
    "createddate": "2019-10-03T12:37:45.297Z"
  }
},
{
  "_index": "abc",
  "_type": "_doc",
  "_id": "PQR",
  "_score": 6.514091,
  "_source": {
    "category": "Vegetables",
    "action": "eating",
    "metainfo": { "hash": "nzUZ1ONm0e167p" },
    "createddate": "2019-10-03T12:37:45.297Z"
  }
},
-----------------
----------------]

I have around 30,000 records as input data. How can I insert this data in a single query? I tried:

var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
  host: '********',
  log: 'trace'
});

client.index({
  index: "abc",
  body: ****input data*****
}).then((res) => {
  console.log(res);
}, (err) => {
  console.log("err", err);
});

In this code I send the input data in the body, but it returns an error. Please suggest a solution.
This seems like what you are looking for:

'use strict'

require('array.prototype.flatmap').shim()
const { Client } = require('@elastic/elasticsearch')
const client = new Client({ node: 'http://localhost:9200' })

async function run () {
  await client.indices.create({
    index: 'tweets',
    body: {
      mappings: {
        properties: {
          id: { type: 'integer' },
          text: { type: 'text' },
          user: { type: 'keyword' },
          time: { type: 'date' }
        }
      }
    }
  }, { ignore: [400] })

  const dataset = [{
    id: 1,
    text: 'If I fall, don\'t bring me back.',
    user: 'jon',
    date: new Date()
  }, {
    id: 2,
    text: 'Winter is coming',
    user: 'ned',
    date: new Date()
  }, {
    id: 3,
    text: 'A Lannister always pays his debts.',
    user: 'tyrion',
    date: new Date()
  }, {
    id: 4,
    text: 'I am the blood of the dragon.',
    user: 'daenerys',
    date: new Date()
  }, {
    id: 5, // change this value to a string to see the bulk response with errors
    text: 'A girl is Arya Stark of Winterfell. And I\'m going home.',
    user: 'arya',
    date: new Date()
  }]

  // The major part is below:
  const body = dataset.flatMap(doc => [{ index: { _index: 'tweets' } }, doc])

  const { body: bulkResponse } = await client.bulk({ refresh: true, body })

  if (bulkResponse.errors) {
    const erroredDocuments = []
    // The items array has the same order as the dataset we just indexed.
    // The presence of the `error` key indicates that the operation
    // that we did for the document has failed.
    bulkResponse.items.forEach((action, i) => {
      const operation = Object.keys(action)[0]
      if (action[operation].error) {
        erroredDocuments.push({
          // If the status is 429 it means that you can retry the document,
          // otherwise it's very likely a mapping error, and you should
          // fix the document before trying it again.
          status: action[operation].status,
          error: action[operation].error,
          operation: body[i * 2],
          document: body[i * 2 + 1]
        })
      }
    })
    console.log(erroredDocuments)
  }

  const { body: count } = await client.count({ index: 'tweets' })
  console.log(count)
}

run().catch(console.log)

Reference link: https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/bulk_examples.html
Update the same property of every document of a mongoDb collection with different values
I have a collection in MongoDB which looks like this:

{ "slno" : NumberInt(1), "name" : "Item 1" }
{ "slno" : NumberInt(2), "name" : "Item 2" }
{ "slno" : NumberInt(3), "name" : "Item 3" }

I am receiving a request from an AngularJS frontend to update this collection to:

{ "slno" : NumberInt(1), "name" : "Item 3" }
{ "slno" : NumberInt(2), "name" : "Item 1" }
{ "slno" : NumberInt(3), "name" : "Item 2" }

I am using the Mongoose 5.0 ORM with Node 6.11 and Express 4.15. Please help me find the best way to achieve this.
You basically want bulkWrite(), which can take the input array of objects and use it to make a "batch" of requests to update the matched documents. Presuming the array of documents is being sent in req.body.updates, then you would have something like:

const Model = require('../models/model');

router.post('/update', (req, res) => {
  Model.bulkWrite(
    req.body.updates.map(({ slno, name }) => ({
      updateOne: {
        filter: { slno },
        update: { $set: { name } }
      }
    }))
  )
  .then(result => {
    // maybe do something with the WriteResult
    res.send("ok"); // or whatever response
  })
  .catch(e => {
    // do something with any error
  });
});

This sends a request given the input as:

bulkWrite([
  { updateOne: { filter: { slno: 1 }, update: { '$set': { name: 'Item 3' } } } },
  { updateOne: { filter: { slno: 2 }, update: { '$set': { name: 'Item 1' } } } },
  { updateOne: { filter: { slno: 3 }, update: { '$set': { name: 'Item 2' } } } }
])

Which efficiently performs all updates in a single request to the server with a single response. Also see the core MongoDB documentation on bulkWrite(). That's the documentation for the mongo shell method, but all the options and syntax are exactly the same in most drivers, and especially within all JavaScript-based drivers.
As a full working demonstration of the method in use with mongoose:

const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const testSchema = new Schema({
  slno: Number,
  name: String
});

const Test = mongoose.model('Test', testSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

const data = [1, 2, 3].map(n => ({ slno: n, name: `Item ${n}` }));
const request = [[1, 3], [2, 1], [3, 2]]
  .map(([slno, n]) => ({ slno, name: `Item ${n}` }));

mongoose.connect(uri)
  .then(conn => Promise.all(Object.keys(conn.models).map(
    k => conn.models[k].remove()))
  )
  .then(() => Test.insertMany(data))
  .then(() => Test.bulkWrite(
    request.map(({ slno, name }) => ({
      updateOne: {
        filter: { slno },
        update: { $set: { name } }
      }
    }))
  ))
  .then(result => log(result))
  .then(() => Test.find())
  .then(data => log(data))
  .catch(e => console.error(e))
  .then(() => mongoose.disconnect());

Or for more modern environments with async/await:

const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';

mongoose.Promise = global.Promise;
mongoose.set('debug', true);

const testSchema = new Schema({
  slno: Number,
  name: String
});

const Test = mongoose.model('Test', testSchema);

const log = data => console.log(JSON.stringify(data, undefined, 2));

const data = [1, 2, 3].map(n => ({ slno: n, name: `Item ${n}` }));
const request = [[1, 3], [2, 1], [3, 2]]
  .map(([slno, n]) => ({ slno, name: `Item ${n}` }));

(async function() {
  try {
    const conn = await mongoose.connect(uri);
    await Promise.all(Object.entries(conn.models).map(([k, m]) => m.remove()));
    await Test.insertMany(data);

    let result = await Test.bulkWrite(
      request.map(({ slno, name }) => ({
        updateOne: {
          filter: { slno },
          update: { $set: { name } }
        }
      }))
    );
    log(result);

    let current = await Test.find();
    log(current);

    mongoose.disconnect();
  } catch (e) {
    console.error(e);
  } finally {
    process.exit();
  }
})();

Which loads the initial data and then updates, showing the response object (serialized) and the resulting items in the collection after the update is processed:

Mongoose: tests.remove({}, {})
Mongoose: tests.insertMany([ { _id: 5b1b89348f3c9e1cdb500699, slno: 1, name: 'Item 1', __v: 0 }, { _id: 5b1b89348f3c9e1cdb50069a, slno: 2, name: 'Item 2', __v: 0 }, { _id: 5b1b89348f3c9e1cdb50069b, slno: 3, name: 'Item 3', __v: 0 } ], {})
Mongoose: tests.bulkWrite([ { updateOne: { filter: { slno: 1 }, update: { '$set': { name: 'Item 3' } } } }, { updateOne: { filter: { slno: 2 }, update: { '$set': { name: 'Item 1' } } } }, { updateOne: { filter: { slno: 3 }, update: { '$set': { name: 'Item 2' } } } } ], {})
{
  "ok": 1,
  "writeErrors": [],
  "writeConcernErrors": [],
  "insertedIds": [],
  "nInserted": 0,
  "nUpserted": 0,
  "nMatched": 3,
  "nModified": 3,
  "nRemoved": 0,
  "upserted": [],
  "lastOp": {
    "ts": "6564991738253934601",
    "t": 20
  }
}
Mongoose: tests.find({}, { fields: {} })
[
  {
    "_id": "5b1b89348f3c9e1cdb500699",
    "slno": 1,
    "name": "Item 3",
    "__v": 0
  },
  {
    "_id": "5b1b89348f3c9e1cdb50069a",
    "slno": 2,
    "name": "Item 1",
    "__v": 0
  },
  {
    "_id": "5b1b89348f3c9e1cdb50069b",
    "slno": 3,
    "name": "Item 2",
    "__v": 0
  }
]

That's using syntax which is compatible with NodeJS v6.x.
A small change in Neil Lunn's answer did the job:

const Model = require('../models/model');

router.post('/update', (req, res) => {
  var tempArray = [];

  req.body.updates.map(({ slno, name }) => {
    tempArray.push({
      updateOne: {
        filter: { slno },
        update: { $set: { name } }
      }
    });
  });

  Model.bulkWrite(tempArray).then((result) => {
    // Send response
  }).catch((err) => {
    // Handle error
  });
});

Thanks to Neil Lunn.
Sequelize OR condition object
By creating an object like this:

var condition = {
  where: {
    LastName: "Doe",
    FirstName: ["John", "Jane"],
    Age: {
      gt: 18
    }
  }
};

and passing it in:

Student.findAll(condition)
  .success(function(students) {
  });

it could beautifully generate SQL like this:

SELECT * FROM Student WHERE LastName='Doe' AND FirstName in ("John","Jane") AND Age>18

However, these are all 'AND' conditions; how could I generate an 'OR' condition by creating a condition object?
Seems there is another format now:

where: {
  LastName: "Doe",
  $or: [
    {
      FirstName: {
        $eq: "John"
      }
    },
    {
      FirstName: {
        $eq: "Jane"
      }
    },
    {
      Age: {
        $gt: 18
      }
    }
  ]
}

Will generate:

WHERE LastName='Doe' AND (FirstName = 'John' OR FirstName = 'Jane' OR Age > 18)

See the doc: http://docs.sequelizejs.com/en/latest/docs/querying/#where
String-based operators will be deprecated in the future (you've probably seen the warning in the console). Getting this to work with symbolic operators was quite confusing for me, and I've updated the docs with two examples.

Post.findAll({
  where: {
    [Op.or]: [{ authorId: 12 }, { authorId: 13 }]
  }
});
// SELECT * FROM post WHERE authorId = 12 OR authorId = 13;

Post.findAll({
  where: {
    authorId: {
      [Op.or]: [12, 13]
    }
  }
});
// SELECT * FROM post WHERE authorId = 12 OR authorId = 13;
Use Sequelize.or:

var condition = {
  where: Sequelize.and(
    { name: 'a project' },
    Sequelize.or(
      { id: [1, 2, 3] },
      { id: { lt: 10 } }
    )
  )
};

Reference (search for Sequelize.or). Edit: this has since been modified; for the latest method, see Morio's answer.
In Sequelize version 5 you might also do it this way (making full use of Sequelize operators):

var condition = {
  [Op.or]: [
    {
      LastName: {
        [Op.eq]: "Doe"
      }
    },
    {
      FirstName: {
        [Op.or]: ["John", "Jane"]
      }
    },
    {
      Age: {
        [Op.gt]: 18
      }
    }
  ]
};

And then you must include this:

const Op = require('Sequelize').Op

and pass the condition in:

Student.findAll(condition)
  .success(function(students) {
    //
  });

It could beautifully generate SQL like this:

SELECT * FROM Student WHERE LastName='Doe' OR FirstName in ("John","Jane") OR Age>18
For Sequelize 4

Query:

SELECT * FROM Student WHERE LastName='Doe' AND (FirstName = "John" or FirstName = "Jane") AND Age BETWEEN 18 AND 24

Syntax with operators:

const Op = require('Sequelize').Op;

var r = await to(Student.findAll({
  where: {
    LastName: "Doe",
    FirstName: {
      [Op.or]: ["John", "Jane"]
    },
    Age: {
      // [Op.gt]: 18
      [Op.between]: [18, 24]
    }
  }
}));

Notes:
Avoid the alias operators $ (e.g. $and, $or ...) as these will be deprecated.
Unless you have { freezeTableName: true } set in the table model, Sequelize will query against the plural form of its name (Student -> Students).
See the docs about querying. It would be:

$or: [{ a: 5 }, { a: 6 }]  // (a = 5 OR a = 6)
where: {
  [Op.or]: [
    {
      id: {
        [Op.in]: recordId,
      },
    },
    {
      id: {
        [Op.eq]: recordId,
      },
    },
  ],
},

This works for me!
For those who are facing issues in making a more complex query, like:

// where email = 'xyz@mail.com' AND (( firstname = 'first' OR lastname = 'last' ) AND age > 18)

it would be:

[Op.and]: [
  {
    "email": { [Op.eq]: 'xyz@mail.com' }
    // OR "email": 'xyz@mail.com'
  },
  {
    [Op.and]: [
      {
        [Op.or]: [
          { "firstname": "first" },
          { "lastname": "last" }
        ]
      },
      {
        "age": { [Op.gt]: 18 }
      }
    ]
  }
]