I am having trouble writing a MongoDB query to update a nested array of objects.
My current document in the collection is like this:
{
"_id": 1,
"data": {
"arr_1": [
{
"inner_arr_1": [
{
"aid": "111",
"ans": "def"
},
{
"aid": "222",
"ans": "def"
},
],
"inner_arr_2": [
{
"aid": "333",
"ans": "def"
},
{
"aid": "444",
"ans": "def"
},
],
"inner_arr_3": [
{
"aid": "555",
"ans": "def"
},
{
"aid": "666",
"ans": "def"
},
]
},
]
},
"name": "John"
}
I want to write an API endpoint with Express in Node.js to perform this nested update.
The POST body of the endpoint looks like this:
req.body = {
"_id": 1,
"data": {
"arr_1": [
{
"inner_arr_1": [
{
"aid": "111",
"ans": "new val"
}
],
"inner_arr_3": [
{
"aid": "666",
"ans": "new val"
}
]
},
]
}
}
Can anyone help me write a query to solve this problem?
Thank you.
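One way to approach this (a minimal sketch, assuming Express with the official mongodb driver; the route path, the "docs" collection name and the db handle on app.locals are placeholders, not from the question) is to turn every { aid, ans } pair in the posted body into a filtered positional update and send them all in one bulkWrite:
const express = require("express");
const app = express();
app.use(express.json());

app.post("/documents/update", async (req, res) => {
  const { _id, data } = req.body;
  const ops = [];

  // Build one update per { aid, ans } pair: $[] spans data.arr_1,
  // $[el] picks the element with the matching aid.
  for (const outer of data.arr_1 || []) {
    for (const [innerKey, entries] of Object.entries(outer)) {
      for (const { aid, ans } of entries) {
        ops.push({
          updateOne: {
            filter: { _id },
            update: { $set: { [`data.arr_1.$[].${innerKey}.$[el].ans`]: ans } },
            arrayFilters: [{ "el.aid": aid }],
          },
        });
      }
    }
  }

  if (!ops.length) return res.json({ modified: 0 });
  const result = await req.app.locals.db.collection("docs").bulkWrite(ops);
  res.json({ modified: result.modifiedCount });
});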
I have three collections called location, company and vouchers, and they are structured as follows:
"company": [
{
"_id": "625ae79a51828244cef979d4",
"name": "C1",
"location": "L1",
"category": "A"
},
{
"_id": "625ba41651828244cefa138b",
"name": "C2",
"location": "L2",
"category": "B"
},
{
"_id": "625ba4d651828244cefa1951",
"name": "C3",
"location": "L3",
"category": "B"
}
]
"vouchers":[
{
"_id":"625ae79a51828244cef979d4",
"name":"V1",
"color":"#ad7f7f",
"category":[
"A",
"B"
]
},
{
"_id":"625ba41651828244cefa138b",
"name":"V2",
"color":"#9A348E",
"category":[
"A"
]
},
{
"_id":"625ba4d651828244cefa1951",
"name":"V3",
"color":"#31263E",
"category":[
"B"
]
},
{
"_id":"625ba4d651828244cefa1951",
"name":"V4",
"color":"#31263E",
"category":[
"C"
]
}
]
and the expected result should be
"companies":[
{
"_id":"625ae79a51828244cef979d4",
"name":"C1",
"location":"L1",
"category":"A",
"vouchers":[
{
"_id":"625ae79a51828244cef979d4",
"name":"V1",
"color":"#ad7f7f",
"category":[
"A",
"B"
]
},
{
"_id":"625ba41651828244cefa138b",
"name":"V2",
"color":"#9A348E",
"category":[
"A"
]
}
]
},
{
"_id":"625ba41651828244cefa138b",
"name":"C2",
"location":"L2",
"category":"B",
"vouchers":[
{
"_id":"625ae79a51828244cef979d4",
"name":"V1",
"color":"#ad7f7f",
"category":[
"A",
"B"
]
},
{
"_id":"625ba4d651828244cefa1951",
"name":"V3",
"color":"#31263E",
"category":[
"B"
]
}
]
},
{
"_id":"625ba4d651828244cefa1951",
"name":"C3",
"location":"L3",
"category":"C",
"vouchers":[
{
"_id":"625ba4d651828244cefa1951",
"name":"V4",
"color":"#31263E",
"category":[
"C"
]
}
]
}
]
I am trying to get the expected result using the MongoDB aggregation pipeline. First I filtered the location documents with $geoWithin and used $lookup to find the related companies. Next I need to get all the vouchers that have the same category as the company. I applied the code below to the previous result to get the vouchers.
$lookup: {
from: "vouchers",
let: { category: "$company.category" },
pipeline: [
{
$match: {
$expr: {
$and: [{ $in: ["$category", "$$category"] }],
},
},
},
],
as: "vouchers",
},
But it gives an empty result after this stage. Any suggestions would be appreciated.
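One likely cause of the empty result: on the voucher side $category is an array (e.g. ["A", "B"]), while $$category holds the company categories, so $in compares an array against single strings and never matches. A sketch that checks for an overlap instead, assuming $company.category resolves to an array of category strings after the earlier $lookup (if the companies are unwound first so it is a single string, wrap it as ["$$companyCategories"]):
{
  $lookup: {
    from: "vouchers",
    let: { companyCategories: "$company.category" },
    pipeline: [
      {
        $match: {
          $expr: {
            // keep vouchers whose category array shares at least one
            // value with the company's categories
            $gt: [
              { $size: { $setIntersection: ["$category", "$$companyCategories"] } },
              0
            ]
          }
        }
      }
    ],
    as: "vouchers"
  }
}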
Consider the following document skeleton:
{
_id: "615749dce3438547adfff9bc",
items: [
{
type: "shirt",
color: "red",
sizes: [
{
label: "medium",
stock: 10,
price: 20,
},
{
label: "large",
stock: 30,
price: 40,
}
]
},
{
type: "shirt",
color: "green",
sizes: [
{
label: "small",
stock: 5,
price: 3,
},
{
label: "medium",
stock: 5,
price: 3,
},
]
}
]
}
When a new item comes in, I want to push a new document into items, unless an item already exists with the same type and color as the new one; in that case I only want to merge its sizes into that existing item's sizes.
sizes does not have to be unique.
I tried to use $push with upsert: true and arrayFilters, but apparently $push ignores arrayFilters.
I am using Node with the mongodb package.
Query1
Filter to see if the item already exists.
If it exists, map over the array to update it; otherwise append it at the end.
*2 array reads, but still faster than Query2
Test code here
db.collection.update({},
[
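// stage 1: stash the incoming item on the document as "newitem"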
{
"$set": {
"newitem": {
"type": "shirt",
"color": "red",
"sizes": [
{
"label": "medium"
}
]
}
}
},
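// stage 2: "found" is true when an item with the same type and color already exists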
{
"$set": {
"found": {
"$ne": [
{
"$filter": {
"input": "$items",
"cond": {
"$and": [
{
"$eq": [
"$$this.type",
"$newitem.type"
]
},
{
"$eq": [
"$$this.color",
"$newitem.color"
]
}
]
}
}
},
[]
]
}
}
},
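// stage 3: if not found, append "newitem"; otherwise merge its sizes into the matching item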
{
"$set": {
"items": {
"$cond": [
{
"$not": [
"$found"
]
},
{
"$concatArrays": [
"$items",
[
"$newitem"
]
]
},
{
"$map": {
"input": "$items",
"in": {
"$cond": [
{
"$and": [
{
"$eq": [
"$$this.type",
"$newitem.type"
]
},
{
"$eq": [
"$$this.color",
"$newitem.color"
]
}
]
},
{
"$mergeObjects": [
"$$this",
{
"sizes": {
"$concatArrays": [
"$$this.sizes",
"$newitem.sizes"
]
}
}
]
},
"$$this"
]
}
}
}
]
}
}
},
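// stage 4: drop the temporary helper fields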
{
"$unset": [
"found",
"newitem"
]
}
])
Query2
(alternative solution)
Reduce over the array and do the update in a single pass.
If a match is found, keep the updated item; otherwise append the new item at the end.
*1 array read (but concat is slow for big arrays, >500 members; if you have big arrays, use Query1)
*This would be the normal way to do it if we had a fast way to append to the end of the array, but we don't, so Query1 is faster.
Test code here
db.collection.update({},
[
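// stage 1: stash the incoming item on the document as "newitem"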
{
"$set": {
"newitem": {
"type": "shirt",
"color": "red",
"sizes": [
{
"label": "medium"
}
]
}
}
},
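// stage 2: a single $reduce pass rebuilds "items", merging sizes into the first item whose type and color match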
{
"$set": {
"items-found": {
"$reduce": {
"input": "$items",
"initialValue": {
"items": [],
"found": null
},
"in": {
"$cond": [
{
"$and": [
{
"$eq": [
"$$value.found",
null
]
},
{
"$eq": [
"$$this.type",
"$newitem.type"
]
},
{
"$eq": [
"$$this.color",
"$newitem.color"
]
}
]
},
{
"items": {
"$concatArrays": [
"$$value.items",
[
{
"$mergeObjects": [
"$$this",
{
"sizes": {
"$concatArrays": [
"$$this.sizes",
"$newitem.sizes"
]
}
}
]
}
]
]
},
"found": true
},
{
"items": {
"$concatArrays": [
"$$value.items",
[
"$$this"
]
]
},
"found": "$$value.found"
}
]
}
}
}
}
},
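// stage 3: if nothing matched, append "newitem" at the end; otherwise keep the rebuilt array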
{
"$set": {
"items": {
"$cond": [
"$items-found.found",
"$items-found.items",
{
"$concatArrays": [
"$items-found.items",
[
"$newitem"
]
]
}
]
}
}
},
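// stage 4: drop the temporary helper fields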
{
"$unset": [
"items-found",
"newitem"
]
}
])
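Since the question mentions Node with the mongodb package, here is a minimal sketch of running Query1 through the driver (the connection string, db and collection names are assumptions); $literal keeps the user-supplied item from being parsed as field paths or expressions:
const { MongoClient } = require("mongodb");

async function upsertItem(newItem) {
  const client = new MongoClient("mongodb://localhost:27017"); // assumed URI
  await client.connect();
  try {
    const col = client.db("test").collection("collection");    // assumed names
    return await col.updateOne({}, [
      // $literal stops user-supplied strings from being read as field paths
      { $set: { newitem: { $literal: newItem } } },
      { $set: { found: { $ne: [
        { $filter: { input: "$items", cond: { $and: [
          { $eq: ["$$this.type", "$newitem.type"] },
          { $eq: ["$$this.color", "$newitem.color"] } ] } } },
        [] ] } } },
      { $set: { items: { $cond: [
        { $not: ["$found"] },
        { $concatArrays: ["$items", ["$newitem"]] },
        { $map: { input: "$items", in: { $cond: [
          { $and: [
            { $eq: ["$$this.type", "$newitem.type"] },
            { $eq: ["$$this.color", "$newitem.color"] } ] },
          { $mergeObjects: ["$$this", { sizes: { $concatArrays: ["$$this.sizes", "$newitem.sizes"] } }] },
          "$$this" ] } } } ] } } },
      { $unset: ["found", "newitem"] },
    ]);
  } finally {
    await client.close();
  }
}

// usage:
// await upsertItem({ type: "shirt", color: "red", sizes: [{ label: "medium" }] });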
My Mongo collection is named tests, and it has the following documents in it:
[
{
"title": "One",
"uid": "1",
"_metadata": {
"references": [
{
"uid": "2"
},
{
"asssetuid": 10
}
]
}
},
{
"title": "Two",
"uid": "2",
"_metadata": {
"references": [
{
"uid": "3"
},
{
"asssetuid": 11
}
]
}
},
{
"title": "Three",
"uid": "3",
"_metadata": {
"references": []
}
}
]
And I want the result in the following format (for uid: 1):
[
{
"title": "One",
"uid": 1,
"_metadata": {
"references": [
{
"asssetuid": 10
},
{
"asssetuid": 11
},
{
"title": "Two",
"uid": "2",
"_metadata": {
"references": [
{
"title": "Three",
"uid": "3"
}
]
}
}
]
}
}
]
For uid: 2 I want the following result:
[
{
"title": "Two",
"uid": 2,
"_metadata": {
"references": [
{
"asssetuid": 11
},
{
"title": "Three",
"uid": "3"
}
]
}
}
]
Which query should I use here to get the corresponding result for a given uid? I want the result as a parent-child relationship. Is this possible using MongoDB's $graphLookup, or is there any other query we can use to get the result? Please help me with this.
New Type Output
[{
"title": "One",
"uid": 1,
"_metadata": {
"assets": [{
"asssetuid": 10,
"parent": 1
}, {
"asssetuid": 11,
"parent": 2
}],
"entries": [{
"title": "Two",
"uid": "2",
"parent": 1
}, {
"title": "Three",
"uid": "3",
"parent": 2
}]
}
}]
Mongo supports automatic reference resolution using $ref, but for that you need to change your schema a little, and reference resolution is only supported by some drivers.
You need to store your data in this format:
[
...
{
"_id": ObjectId("5a934e000102030405000000"),
"_metadata": {
"references": [
{
"$ref": "collection",
"$id": ObjectId("5a934e000102030405000001"),
"$db": "database"
},
{
"asssetuid": 10
}
]
},
"title": "One",
"uid": "1"
},
....
]
For more details on $ref, refer to the official documentation: label-document-references
OR
You can resolve the references using $graphLookup, but the only problem with $graphLookup is that you will lose the asssetuid. Here is the query; it resolves the references and gives the output as a flat map:
db.collection.aggregate([
{
$match: {
uid: "1"
}
},
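// recursively resolve the documents referenced via _metadata.references.uid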
{
$graphLookup: {
from: "collection",
startWith: "$_metadata.references.uid",
connectFromField: "_metadata.references.uid",
connectToField: "uid",
depthField: "depth",
as: "resolved"
}
},
{
"$addFields": {
"references": "$resolved",
"metadata": [
{
"_metadata": "$_metadata"
}
]
}
},
{
"$project": {
"references._metadata": 0,
}
},
{
"$project": {
"references": "$references",
"merged": {
"$concatArrays": [
"$metadata",
"$resolved"
]
}
}
},
{
"$project": {
results: [
{
merged: "$merged"
},
{
references: "$references"
}
]
}
},
{
"$unwind": "$results"
},
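// split the results into two facets: the asssetuid entries pulled from the merged metadata, and the resolved referenced documents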
{
"$facet": {
"assest": [
{
"$match": {
"results.merged": {
"$exists": true
}
}
},
{
"$unwind": "$results.merged"
},
{
"$unwind": "$results.merged._metadata.references"
},
{
"$match": {
"results.merged._metadata.references.asssetuid": {
"$exists": true
}
}
},
{
"$project": {
_id: 0,
"asssetuid": "$results.merged._metadata.references.asssetuid"
}
}
],
"uid": [
{
"$match": {
"results.references": {
"$exists": true
}
}
},
{
"$unwind": "$results.references"
},
{
$replaceRoot: {
newRoot: "$results.references"
}
}
]
}
},
{
"$project": {
"references": {
"$concatArrays": [
"$assest",
"$uid"
]
}
}
}
])
Here is the link to the playground to test it: Mongo Playground
I am using Node with Mongoose.
I need to update this array by pushing a new item into Breakfast (the foodList of the mealList entry whose time is "Breakfast", or any other time).
I want to add a new food item to the foodList selected by time. Can you please give me a suggestion on how to do this?
{
"_id": "5fe43eb44cd6820963c98c32",
"name": "Monday Diet",
"userID": "5f225d7458b48d0fe897662e",
"day": "Monday",
"type": "private",
"mealList": [
{
"_id": "5fe43eb44cd6820963c98c33",
"time": "Breakfast",
"foodList": [
{
"_id": "5fe43eb44cd6820963c98c34",
"foodName": "Eggs",
"Qty": "2",
"calories": "calories",
"category": "category"
}
]
},
{
"_id": "5fe43eb44cd6820963c98c36",
"time": "Lunch",
"foodList": [
{
"_id": "5fe43eb44cd6820963c98c37",
"foodName": "food1",
"Qty": "100g"
},
]
}
],
"createdAt": "2020-12-24T07:09:40.141Z",
"updatedAt": "2020-12-24T07:09:40.141Z",
"__v": 0
}
I tried:
Diet.updateOne(
{ "Diet.mealList._id": req.body.mealId },
// { $push: { "Diet.0.mealList.$.foodList": req.body.foodList } },
{ $push: { foodList: req.body.foodList } }
)
A few fixes:
convert your string _id to an ObjectId using mongoose.Types.ObjectId
remove the Diet prefix from the query path and push the object into mealList.$.foodList using the positional operator
Diet.updateOne({
"mealList._id": mongoose.Types.ObjectId(req.body.mealId)
},
{
$push: {
"mealList.$.foodList": req.body.foodList
}
})
Playground
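If req.body.foodList is an array of food objects rather than a single object, it is worth wrapping it in $each so every entry is appended individually (a sketch under that assumption):
Diet.updateOne(
  { "mealList._id": mongoose.Types.ObjectId(req.body.mealId) },
  // $each pushes each element of the array instead of the array itself
  { $push: { "mealList.$.foodList": { $each: req.body.foodList } } }
)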
Cloudant is returning the error message:
{"error":"invalid_key","reason":"Invalid key use-index for this request."}
whenever I try to query against an index with the combination operator, "$or".
A sample of what my documents look like is:
{
"_id": "28f240f1bcc2fbd9e1e5174af6905349",
"_rev": "1-fb9a9150acbecd105f1616aff88c26a8",
"type": "Feature",
"properties": {
"PageName": "A8",
"PageNumber": 1,
"Lat": 43.051523,
"Long": -71.498852
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[
-71.49978935969642,
43.0508382914137
],
[
-71.49978564033566,
43.052210148524
],
[
-71.49791499857444,
43.05220740550381
],
[
-71.49791875962663,
43.05083554852429
],
[
-71.49978935969642,
43.0508382914137
]
]
]
}
}
The index I created is for the field "properties.PageName", and it works fine when I'm querying for just one document, but as soon as I try for multiple ones, I receive the error response quoted at the beginning.
If it helps any, here is the call:
POST https://xyz.cloudant.com/db/_find
request body:
{
"selector": {
"$or": [
{ "properties.PageName": "A8" },
{ "properties.PageName": "M30" },
{ "properties.PageName": "AH30" }
]
},
"use-index": "pagename-index"
}
In order to perform an $or query you need to create a text (full-text) index rather than a JSON index. For example, I just created the following index:
{
"index": {
"fields": [
{"name": "properties.PageName", "type": "string"}
]
},
"type": "text"
}
I was then able to perform the following query:
{
"selector": {
"$or": [
{ "properties.PageName": "A8" },
{ "properties.PageName": "M30" },
{ "properties.PageName": "AH30" }
]
}
}
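For reference, a sketch of creating such a text index through the HTTP API, using the same database URL as in the question (the index name here is an arbitrary choice):
POST https://xyz.cloudant.com/db/_index
request body:
{
  "index": {
    "fields": [
      {"name": "properties.PageName", "type": "string"}
    ]
  },
  "name": "pagename-text-index",
  "type": "text"
}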