I am trying to implement DataLoader in my project.
Here is my code:
subCategory: async (parent, _, { loaders }) => {
  console.log(parent);
  // I tried this but I am getting errors
  const subcategory = await loaders.subCategory.load(parent.subCategory.toString());
  return subcategory;
}
The console output of the above code:
{
_id: new ObjectId("61a658932c7ad474fde70893"),
name: "Men's",
createdAt: 2021-11-30T17:00:03.225Z,
updatedAt: 2021-12-05T13:52:26.848Z,
__v: 2,
subCategory: [ // Array of IDs
new ObjectId("61acc3f8a49c6cf3b04cf225"),
new ObjectId("61acc41aa49c6cf3b04cf22b")
]
}
{
_id: new ObjectId("61a65980890ef431bc0ecbcf"),
name: 'Women',
createdAt: 2021-11-30T17:04:00.530Z,
updatedAt: 2021-12-05T13:54:00.449Z,
__v: 3,
subCategory: [ // Array of IDs
new ObjectId("61acc460dba0e6083be97fb5"),
new ObjectId("61acc471dba0e6083be97fbd"),
new ObjectId("61acc478dba0e6083be97fc3")
]
}
I tried it with this batch function:
module.exports.batchSubCategory = async (categoryIds) => {
  const subcategory = await Subcategory.find({ _id: { $in: categoryIds } });
  return categoryIds.map(categoryId => subcategory.find(category => category.id === categoryId));
}
Here is the context:
context: async ({ req }) => {
  return {
    loaders: {
      subCategory: new Dataloader(keys => loaders.categoryLoaders.batchSubCategory(keys))
    }
  };
},
I am using Apollo Server.
I have also encountered this type of problem. The solution was to use the .loadMany() method.
You can also refer to this URL https://medium.com/the-marcy-lab-school/how-to-use-dataloader-js-9727c527efd0.
Just try the following.
Instead of the .load() method:
const subcategory = await loaders.subCategory.load(parent.subCategory.toString());
use the .loadMany() method:
const subcategory = await loaders.subCategory.loadMany(parent.subCategory);
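With .loadMany() each subcategory id is passed to the batch function as its own key, so the batch function should return one subcategory per key, in the same order. A minimal sketch of such a batch function (assuming the Subcategory model from the question; the ObjectId keys are compared as strings):

module.exports.batchSubCategory = async (subCategoryIds) => {
  const subcategories = await Subcategory.find({ _id: { $in: subCategoryIds } });
  // DataLoader expects the results in the same order as the input keys
  return subCategoryIds.map(id =>
    subcategories.find(sub => sub.id === id.toString())
  );
};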
Let's say I have two objects: Product and Seller
Products can have multiple Sellers.
A single Seller can sell multiple Products.
The goal is to write a seeding script that successfully seeds my MongoDB database such that Keystone.js's CMS recognizes the many-to-many relationship.
Schemas
Product.ts
import { text, relationship } from "@keystone-next/fields";
import { list } from "@keystone-next/keystone/schema";

export const Product = list({
  fields: {
    name: text({ isRequired: true }),
    sellers: relationship({
      ref: "Seller.products",
      many: true,
    }),
  },
});
Seller.ts
import { text, relationship } from "@keystone-next/fields";
import { list } from "@keystone-next/keystone/schema";

export const Seller = list({
  fields: {
    name: text({ isRequired: true }),
    products: relationship({
      ref: "Product.sellers",
      many: true,
    }),
  },
});
KeystoneJS config
My keystone.ts config, shortened for brevity, looks like this:
import { insertSeedData } from "./seed-data"
...
db: {
adapter: "mongoose",
url: databaseURL,
async onConnect(keystone) {
console.log("Connected to the database!");
if (process.argv.includes("--seed-data")) {
await insertSeedData(keystone);
}
},
},
lists: createSchema({
Product,
Seller,
}),
...
Seeding Scripts (these are the files I expect to change)
I have a script that populates the database (seed-data/index.ts):
import { products } from "./data";
import { sellers } from "./data";
export async function insertSeedData(ks: any) {
// setup code
const keystone = ks.keystone || ks;
const adapter = keystone.adapters?.MongooseAdapter || keystone.adapter;
const { mongoose } = adapter;
mongoose.set("debug", true);
// adding products to DB
for (const product of products) {
await mongoose.model("Product").create(product);
}
// adding sellers to DB
for (const seller of sellers) {
await mongoose.model("Seller").create(seller);
}
}
And finally, data.ts looks something like this:
export const products = [
{
name: "apple",
sellers: ["Joe", "Anne", "Duke", "Alicia"],
},
{
name: "orange",
sellers: ["Duke", "Alicia"],
},
...
];
export const sellers = [
{
name: "Joe",
products: ["apple", "banana"],
},
{
name: "Duke",
products: ["apple", "orange", "banana"],
},
...
];
The above setup does not work for a variety of reasons. The most obvious is that the sellers and products attributes of the Product and Seller objects (respectively) should reference objects (ObjectId) and not names (e.g. "apple", "Joe").
I'll post a few attempts below that I thought would work, but did not:
Attempt 1
I figured I'd just give them temporary ids (the id attribute in data.ts below) and then, once MongoDB assigns an ObjectId, I'll use those.
seed-data/index.ts
...
const productIdsMapping = [];
...
// adding products to DB
for (const product of products) {
const productToPutInMongoDB = { name: product.name };
const { _id } = await mongoose.model("Product").create(productToPutInMongoDB);
productIdsMapping.push(_id);
}
// adding sellers to DB (using product IDs created by MongoDB)
for (const seller of sellers) {
const productMongoDBIds = [];
for (const productSeedId of seller.products) {
productMongoDBIds.push(productIdsMapping[productSeedId]);
const sellerToPutInMongoDB = { name: seller.name, products: productMongoDBIds };
await mongoose.model("Seller").create(sellerToPutInMongoDB);
}
...
data.ts
export const products = [
{
id: 0,
name: "apple",
sellers: [0, 1, 2, 3],
},
{
id: 1,
name: "orange",
sellers: [2, 3],
},
...
];
export const sellers = [
{
id: 0,
name: "Joe",
products: [0, 2],
},
...
{
id: 2,
name: "Duke",
products: [0, 1, 2],
},
...
];
Output (attempt 1):
It just doesn't seem to care about or acknowledge the products attribute.
Mongoose: sellers.insertOne({ _id: ObjectId("$ID"), name: 'Joe', __v: 0}, { session: null })
{
results: {
_id: $ID,
name: 'Joe',
__v: 0
}
}
Attempt 2
I figured maybe I just didn't format it correctly, for some reason, so maybe if I queried the products and shoved them directly into the seller object, that would work.
seed-data/index.ts
...
const productIdsMapping = [];
...
// adding products to DB
for (const product of products) {
const productToPutInMongoDB = { name: product.name };
const { _id } = await mongoose.model("Product").create(productToPutInMongoDB);
productIdsMapping.push(_id);
}
// adding sellers to DB (using product IDs created by MongoDB)
for (const seller of sellers) {
const productMongoDBIds = [];
for (const productSeedId of seller.products) {
productMongoDBIds.push(productIdsMapping[productSeedId]);
}
const sellerToPutInMongoDB = { name: seller.name };
const { _id } = await mongoose.model("Seller").create(sellerToPutInMongoDB);
const resultsToBeConsoleLogged = await mongoose.model("Seller").findByIdAndUpdate(
_id,
{
$push: {
products: productMongoDBIds,
},
},
{ new: true, useFindAndModify: false, upsert: true }
);
}
...
data.ts
Same data.ts file as attempt 1.
Output (attempt 2):
Same thing. No luck on the products attribute appearing.
Mongoose: sellers.insertOne({ _id: ObjectId("$ID"), name: 'Joe', __v: 0}, { session: null })
{
results: {
_id: $ID,
name: 'Joe',
__v: 0
}
}
So, now I'm stuck. I figured attempt 1 would Just Work™ like this answer:
https://stackoverflow.com/a/52965025
Any thoughts?
I figured out a solution. Here's the background:
When I define the schema, Keystone creates corresponding MongoDB collections. If there is a many-to-many relationship between object A and object B, Keystone will create 3 collections: A, B, and A_relationshipToB_B_relationshipToA.
That 3rd collection is the interface between the two. It's just a collection with pairs of ids from A and B.
Hence, in order to seed my database with a many-to-many relationship that shows up in the Keystone CMS, I have to seed not only A and B, but also the 3rd collection: A_relationshipToB_B_relationshipToA.
Hence, seed-data/index.ts will have some code that inserts into that table:
...
for (const seller of sellers) {
  const sellerToAdd = { name: seller.name };
  const { _id } = await mongoose.model("Seller").create(sellerToAdd);
  // Product_sellers_Seller_products insertion
  for (const productId of seller.products) {
    await mongoose
      .model("Product_sellers_Seller_products")
      .create({
        Product_left_id: productIds[productId], // (data.ts id) --> (Mongo ID)
        Seller_right_id: _id,
      });
  }
}
...
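For completeness, here is a sketch of how the productIds map used above could be built while inserting the products (hypothetical, assuming the numeric id fields from the attempt 1 data.ts):

const productIds = {};
for (const product of products) {
  // insert the product without relationship data, then remember which
  // MongoDB _id corresponds to the seed file's numeric id
  const { _id } = await mongoose.model("Product").create({ name: product.name });
  productIds[product.id] = _id;
}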
I am using MongoDB Atlas (https://cloud.mongodb.com/) and the Mongoose library.
I tried to create multiple documents using a transaction, but it is not working.
I am not getting any error, but it seems the rollback is not working properly.
app.js
//*** more code here
var app = express();
require('./models/db');
//*** more code here
models/db.js
var mongoose = require( 'mongoose' );
// Build the connection string
var dbURI = 'mongodb+srv://mydb:pass@cluster0-****.mongodb.net/mydb?retryWrites=true';
// Create the database connection
mongoose.connect(dbURI, {
useCreateIndex: true,
useNewUrlParser: true,
});
// Get Mongoose to use the global promise library
mongoose.Promise = global.Promise;
models/user.js
const mongoose = require("mongoose");
const UserSchema = new mongoose.Schema({
userName: {
type: String,
required: true
},
pass: {
type: String,
select: false
}
});
module.exports = mongoose.model("User", UserSchema, "user");
myroute.js
const db = require("mongoose");
const User = require("./models/user");
router.post("/addusers", async (req, res, next) => {
const SESSION = await db.startSession();
await SESSION.startTransaction();
try {
const newUser = new User({
//*** data for user ***
});
await newUser.save();
//*** for test purpose, trigger some error ***
throw new Error("some error");
await SESSION.commitTransaction();
//*** return data
} catch (error) {
await SESSION.abortTransaction();
} finally {
SESSION.endSession();
}
});
The above code works without errors, but it still creates the user in the DB. It is supposed to roll back the created user, leaving the collection empty.
I don't know what I have missed here. Can anyone please let me know what's wrong?
The app, models, schema and router are in different files.
You need to include the session within the options for all read/write operations which are active during a transaction. Only then are they actually applied to the transaction scope where you are able to roll them back.
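Applied to the route in the question, that means passing the session to save() as well (a minimal sketch of the fix, keeping the rest of the handler as posted):

router.post("/addusers", async (req, res, next) => {
  const session = await db.startSession();
  session.startTransaction();
  try {
    const newUser = new User({
      //*** data for user ***
    });
    // the { session } option is what ties this write to the transaction
    await newUser.save({ session });
    await session.commitTransaction();
    //*** return data
  } catch (error) {
    await session.abortTransaction();
  } finally {
    session.endSession();
  }
});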
As a bit more complete listing, and just using the more classic Order/OrderItems modelling which should be pretty familiar to most people with some relational transactions experience:
const { Schema } = mongoose = require('mongoose');
// URI including the name of the replicaSet connecting to
const uri = 'mongodb://localhost:27017/trandemo?replicaSet=fresh';
const opts = { useNewUrlParser: true };
// sensible defaults
mongoose.Promise = global.Promise;
mongoose.set('debug', true);
mongoose.set('useFindAndModify', false);
mongoose.set('useCreateIndex', true);
// schema defs
const orderSchema = new Schema({
name: String
});
const orderItemsSchema = new Schema({
order: { type: Schema.Types.ObjectId, ref: 'Order' },
itemName: String,
price: Number
});
const Order = mongoose.model('Order', orderSchema);
const OrderItems = mongoose.model('OrderItems', orderItemsSchema);
// log helper
const log = data => console.log(JSON.stringify(data, undefined, 2));
// main
(async function() {
try {
const conn = await mongoose.connect(uri, opts);
// clean models
await Promise.all(
Object.entries(conn.models).map(([k,m]) => m.deleteMany())
)
let session = await conn.startSession();
session.startTransaction();
// Collections must exist in transactions
await Promise.all(
Object.entries(conn.models).map(([k,m]) => m.createCollection())
);
let [order, other] = await Order.insertMany([
{ name: 'Bill' },
{ name: 'Ted' }
], { session });
let fred = new Order({ name: 'Fred' });
await fred.save({ session });
let items = await OrderItems.insertMany(
[
{ order: order._id, itemName: 'Cheese', price: 1 },
{ order: order._id, itemName: 'Bread', price: 2 },
{ order: order._id, itemName: 'Milk', price: 3 }
],
{ session }
);
// update an item
let result1 = await OrderItems.updateOne(
{ order: order._id, itemName: 'Milk' },
{ $inc: { price: 1 } },
{ session }
);
log(result1);
// commit
await session.commitTransaction();
// start another
session.startTransaction();
// Update and abort
let result2 = await OrderItems.findOneAndUpdate(
{ order: order._id, itemName: 'Milk' },
{ $inc: { price: 1 } },
{ 'new': true, session }
);
log(result2);
await session.abortTransaction();
/*
* $lookup join - expect Milk to be price: 4
*
*/
let joined = await Order.aggregate([
{ '$match': { _id: order._id } },
{ '$lookup': {
'from': OrderItems.collection.name,
'foreignField': 'order',
'localField': '_id',
'as': 'orderitems'
}}
]);
log(joined);
} catch(e) {
console.error(e)
} finally {
mongoose.disconnect()
}
})()
So I would generally recommend calling the variable session in lowercase, since this is the name of the key for the "options" object where it is required on all operations. Keeping to the lowercase convention also allows for using the ES6 shorthand property syntax, i.e. { session }:
const conn = await mongoose.connect(uri, opts);
...
let session = await conn.startSession();
session.startTransaction();
Also, the mongoose documentation on transactions is a little misleading, or at least it could be more descriptive. What it refers to as db in the examples is actually the Mongoose Connection instance, and not the underlying Db or the mongoose global import, as some may misinterpret it. Note that in the listing and the excerpt above this is obtained from mongoose.connect() and should be kept within your code as something you can access from a shared import.
Alternatively, you can grab this in modular code via the mongoose.connection property at any time after a connection has been established. This is usually safe inside things such as server route handlers and the like, since there will be a database connection by the time that code is called.
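For example (a small sketch, not from the listing above), a route handler can start its session from that shared connection:

const mongoose = require('mongoose');

async function handler(req, res) {
  // mongoose.connection is the same Connection instance returned by connect()
  const session = await mongoose.connection.startSession();
  session.startTransaction();
  // ... read/write operations passing { session } ...
  await session.commitTransaction();
  session.endSession();
}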
The code also demonstrates the session usage in the different model methods:
let [order, other] = await Order.insertMany([
{ name: 'Bill' },
{ name: 'Ted' }
], { session });
let fred = new Order({ name: 'Fred' });
await fred.save({ session });
All the find(), update(), insert() and delete() based methods have a final "options block" where this session key and value are expected. The save() method's only argument is this options block. This is what tells MongoDB to apply these actions to the current transaction on the referenced session.
In much the same way, before a transaction is committed any requests for a find() or similar which do not specify that session option do not see the state of the data whilst that transaction is in progress. The modified data state is only available to other operations once the transaction completes. Note this has effects on writes as covered in the documentation.
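As an illustration (a sketch, not part of the listing above), the same query issued with and without the session sees different data until the commit:

// inside the open transaction: sees the incremented price
const inTxn = await OrderItems.findOne({ itemName: 'Milk' }, null, { session });

// without the session: still sees the last committed value
const outside = await OrderItems.findOne({ itemName: 'Milk' });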
When an "abort" is issued:
// Update and abort
let result2 = await OrderItems.findOneAndUpdate(
{ order: order._id, itemName: 'Milk' },
{ $inc: { price: 1 } },
{ 'new': true, session }
);
log(result2);
await session.abortTransaction();
Any operations on the active transaction are removed from state and are not applied. As such they are not visible to resulting operations afterwards. In the example here the value in the document is incremented and will show a retrieved value of 5 on the current session. However after session.abortTransaction() the previous state of the document is reverted. Note that any global context which was not reading data on the same session, does not see that state change unless committed.
That should give the general overview. There is more complexity that can be added to handle varying levels of write failure and retries, but that is already extensively covered in documentation and many samples, or can be answered to a more specific question.
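For instance, newer driver and Mongoose versions expose session.withTransaction(), which handles the commit/abort and retries the callback on transient transaction errors (a sketch under that assumption):

await session.withTransaction(async () => {
  // everything in here is retried by the driver on a transient error
  await OrderItems.updateOne(
    { order: order._id, itemName: 'Milk' },
    { $inc: { price: 1 } },
    { session }
  );
});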
Output
For reference, the output of the included listing is shown here:
Mongoose: orders.deleteMany({}, {})
Mongoose: orderitems.deleteMany({}, {})
Mongoose: orders.insertMany([ { _id: 5bf775986c7c1a61d12137dd, name: 'Bill', __v: 0 }, { _id: 5bf775986c7c1a61d12137de, name: 'Ted', __v: 0 } ], { session: ClientSession("80f827fe077044c8b6c0547b34605cb2") })
Mongoose: orders.insertOne({ _id: ObjectId("5bf775986c7c1a61d12137df"), name: 'Fred', __v: 0 }, { session: ClientSession("80f827fe077044c8b6c0547b34605cb2") })
Mongoose: orderitems.insertMany([ { _id: 5bf775986c7c1a61d12137e0, order: 5bf775986c7c1a61d12137dd, itemName: 'Cheese', price: 1, __v: 0 }, { _id: 5bf775986c7c1a61d12137e1, order: 5bf775986c7c1a61d12137dd, itemName: 'Bread', price: 2, __v: 0 }, { _id: 5bf775986c7c1a61d12137e2, order: 5bf775986c7c1a61d12137dd, itemName: 'Milk', price: 3, __v: 0 } ], { session: ClientSession("80f827fe077044c8b6c0547b34605cb2") })
Mongoose: orderitems.updateOne({ order: ObjectId("5bf775986c7c1a61d12137dd"), itemName: 'Milk' }, { '$inc': { price: 1 } }, { session: ClientSession("80f827fe077044c8b6c0547b34605cb2") })
{
"n": 1,
"nModified": 1,
"opTime": {
"ts": "6626894672394452998",
"t": 139
},
"electionId": "7fffffff000000000000008b",
"ok": 1,
"operationTime": "6626894672394452998",
"$clusterTime": {
"clusterTime": "6626894672394452998",
"signature": {
"hash": "AAAAAAAAAAAAAAAAAAAAAAAAAAA=",
"keyId": 0
}
}
}
Mongoose: orderitems.findOneAndUpdate({ order: ObjectId("5bf775986c7c1a61d12137dd"), itemName: 'Milk' }, { '$inc': { price: 1 } }, { session: ClientSession("80f827fe077044c8b6c0547b34605cb2"), upsert: false, remove: false, projection: {}, returnOriginal: false })
{
"_id": "5bf775986c7c1a61d12137e2",
"order": "5bf775986c7c1a61d12137dd",
"itemName": "Milk",
"price": 5,
"__v": 0
}
Mongoose: orders.aggregate([ { '$match': { _id: 5bf775986c7c1a61d12137dd } }, { '$lookup': { from: 'orderitems', foreignField: 'order', localField: '_id', as: 'orderitems' } } ], {})
[
{
"_id": "5bf775986c7c1a61d12137dd",
"name": "Bill",
"__v": 0,
"orderitems": [
{
"_id": "5bf775986c7c1a61d12137e0",
"order": "5bf775986c7c1a61d12137dd",
"itemName": "Cheese",
"price": 1,
"__v": 0
},
{
"_id": "5bf775986c7c1a61d12137e1",
"order": "5bf775986c7c1a61d12137dd",
"itemName": "Bread",
"price": 2,
"__v": 0
},
{
"_id": "5bf775986c7c1a61d12137e2",
"order": "5bf775986c7c1a61d12137dd",
"itemName": "Milk",
"price": 4,
"__v": 0
}
]
}
]
I think this is the quickest way to start performing transactions with Mongoose:
const mongoose = require("mongoose");

// run the transaction on mongoose's default connection; the executor
// callback receives the session, which must be passed to every operation
mongoose.connection.transaction(async function executor(session) {
  try {
    // creating 3 documents in isolation, atomically
    const price = new Price(priceSchema);
    const variant = new Variant(variantSchema);
    const item = new Item(itemSchema);
    await price.save({ session });
    await variant.save({ session });
    // throw new Error("oops, some error in transaction");
    return await item.save({ session });
  } catch (err) {
    console.log(err);
    // rethrow so connection.transaction() aborts instead of committing
    throw err;
  }
});
The helper commits when the callback resolves and aborts (rolls everything back) when it rejects, which is why the error has to be rethrown rather than swallowed.
I've got the Parent Schema:
const parentSchema = new Schema({
name: {
type: String,
},
children: [{
type: Schema.Types.ObjectId,
ref: "Children"
}]
})
And this is the Children Schema:
const childrenSchema = Schema({
name: {
type: String
},
surname: {
type: String
}
})
I have an incoming user register POST request in the following format:
{
"name": "TEST",
"children" : [
{ "name":"test","surname": "test" },
{ "name":"test","surname": "test" }
]
}
Here's the router:
router.post("/register", (req, res, next) => {
const {name, children} = req.body;
let newParent = newParent({
name,
children
});
newParent.save((err, result) => {
// res.send(result) etc.
})
}
This results in the following error:
Cast to Array failed for value "[ { name: 'test', surname: 'test' } ]" at path "children"
How can I save all children and keep only the children _id in the ref, so I can later populate the Parent collection?
The children field in the parent is expecting an array of ObjectIds, but you are passing it an array of objects that do not conform to that expectation. Please try saving the children first, getting their ids, and then using those ids to populate the children field in the parent document. Something like below:
// save the children first (insertMany, since children is a plain array from req.body),
// then attach the generated ids to the parent before saving it
Children.insertMany(children)
  .then(results => {
    const childrenIds = results.map(child => child._id);
    newParent.children = childrenIds;
    return newParent.save();
  })
  .then(result => res.send({ result }))
  .catch(next);
To save child data in the Parent, you first need to save the children using the Children schema, then take the child ids and save them in the Parent document.
Working Example:
let req = {
  "name": "TEST",
  "children": [
    { "name": "test", "surname": "test" },
    { "name": "test", "surname": "test" }
  ]
}

Children.collection.insert(req.children, function (err, docs) {
  if (err) {
    console.log(err);
  } else {
    var ids = docs.ops.map(doc => doc._id);
    console.log(ids);
    let newParent = new Parent({
      name: req.name,
      children: ids
    });
    newParent.save((err, result) => {
      console.log('parent save');
      console.log(err);
      console.log(result);
    });
  }
});
Note: tested on "mongoose": "^5.3.3".
I'm designing a web application that manages the organizational structure of parent and child companies. There are two types of companies: 1) Main company, 2) Subsidiary company. A company can belong to only one parent company but can have a few child companies. My Mongoose schema looks like this:
var companySchema = new mongoose.Schema({
companyName: {
type: String,
required: true
},
estimatedAnnualEarnings: {
type: Number,
required: true
},
companyChildren: [{type: mongoose.Schema.Types.ObjectId, ref: 'Company'}],
companyType: {type: String, enum: ['Main', 'Subsidiary']}
})
module.exports = mongoose.model('Company', companySchema);
I store all my companies in one collection and each company has an array with references to its child companies. I then want to display all companies as a tree (on the client side). I want to query all Main companies, populate their children, have the children populate their children, and so on, with an unlimited nesting level. How can I do that? Or maybe you know a better approach. I also need the ability to view, add, edit and delete any company.
Now I have this:
router.get('/companies', function(req, res) {
Company.find({companyType: 'Main'}).populate({path: 'companyChildren'}).exec(function(err, list) {
if(err) {
console.log(err);
} else {
res.send(list);
}
})
});
But it populates only one nested level.
I appreciate any help
You can do this in the latest Mongoose releases. No plugins required:
const async = require('async'),
mongoose = require('mongoose'),
Schema = mongoose.Schema;
const uri = 'mongodb://localhost/test',
      options = { useMongoClient: true };
mongoose.Promise = global.Promise;
mongoose.set('debug',true);
function autoPopulateSubs(next) {
this.populate('subs');
next();
}
const companySchema = new Schema({
name: String,
subs: [{ type: Schema.Types.ObjectId, ref: 'Company' }]
});
companySchema
.pre('findOne', autoPopulateSubs)
.pre('find', autoPopulateSubs);
const Company = mongoose.model('Company', companySchema);
function log(data) {
console.log(JSON.stringify(data, undefined, 2))
}
async.series(
[
(callback) => mongoose.connect(uri,options,callback),
(callback) =>
async.each(mongoose.models,(model,callback) =>
model.remove({},callback),callback),
(callback) =>
async.waterfall(
[5,4,3,2,1].map( name =>
( name === 5 ) ?
(callback) => Company.create({ name },callback) :
(child,callback) =>
Company.create({ name, subs: [child] },callback)
),
callback
),
(callback) =>
Company.findOne({ name: 1 })
.exec((err,company) => {
if (err) callback(err);
log(company);
callback();
})
],
(err) => {
if (err) throw err;
mongoose.disconnect();
}
)
Or a more modern Promise version with async/await:
const mongoose = require('mongoose'),
Schema = mongoose.Schema;
mongoose.set('debug',true);
mongoose.Promise = global.Promise;
const uri = 'mongodb://localhost/test',
options = { useMongoClient: true };
const companySchema = new Schema({
name: String,
subs: [{ type: Schema.Types.ObjectId, ref: 'Company' }]
});
function autoPopulateSubs(next) {
this.populate('subs');
next();
}
companySchema
.pre('findOne', autoPopulateSubs)
.pre('find', autoPopulateSubs);
const Company = mongoose.model('Company', companySchema);
function log(data) {
console.log(JSON.stringify(data, undefined, 2))
}
(async function() {
try {
const conn = await mongoose.connect(uri,options);
// Clean data
await Promise.all(
Object.keys(conn.models).map(m => conn.models[m].remove({}))
);
// Create data
await [5,4,3,2,1].reduce((acc,name) =>
(name === 5) ? acc.then( () => Company.create({ name }) )
: acc.then( child => Company.create({ name, subs: [child] }) ),
Promise.resolve()
);
// Fetch and populate
let company = await Company.findOne({ name: 1 });
log(company);
} catch(e) {
console.error(e);
} finally {
mongoose.disconnect();
}
})()
Produces:
{
"_id": "595f7a773b80d3114d236a8b",
"name": "1",
"__v": 0,
"subs": [
{
"_id": "595f7a773b80d3114d236a8a",
"name": "2",
"__v": 0,
"subs": [
{
"_id": "595f7a773b80d3114d236a89",
"name": "3",
"__v": 0,
"subs": [
{
"_id": "595f7a773b80d3114d236a88",
"name": "4",
"__v": 0,
"subs": [
{
"_id": "595f7a773b80d3114d236a87",
"name": "5",
"__v": 0,
"subs": []
}
]
}
]
}
]
}
]
}
Note that the async parts are not actually required at all and are just here to set up the data for demonstration. It's the .pre() hooks that make this happen: each .populate() call runs either .find() or .findOne() under the hood, which in turn triggers another .populate() call, "chaining" all the way down the tree.
So this:
function autoPopulateSubs(next) {
this.populate('subs');
next();
}
Is the part being invoked that is actually doing the work.
All done with "middleware hooks".
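Applied to the question's own schema (a sketch, assuming the same hooks are registered for the companyChildren path), the original route then needs no explicit populate call at all:

function autoPopulateChildren(next) {
  this.populate('companyChildren');
  next();
}

companySchema
  .pre('findOne', autoPopulateChildren)
  .pre('find', autoPopulateChildren);

router.get('/companies', function(req, res) {
  // each populate() runs find() under the hood, which re-triggers the hook,
  // so the tree comes back populated to any depth
  Company.find({ companyType: 'Main' }, function(err, list) {
    if (err) return res.status(500).send(err);
    res.send(list);
  });
});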
Data State
To make it clear, this is the data in the collection which is set up. It's just references pointing to each subsidiary in plain flat documents:
{
"_id" : ObjectId("595f7a773b80d3114d236a87"),
"name" : "5",
"subs" : [ ],
"__v" : 0
}
{
"_id" : ObjectId("595f7a773b80d3114d236a88"),
"name" : "4",
"subs" : [
ObjectId("595f7a773b80d3114d236a87")
],
"__v" : 0
}
{
"_id" : ObjectId("595f7a773b80d3114d236a89"),
"name" : "3",
"subs" : [
ObjectId("595f7a773b80d3114d236a88")
],
"__v" : 0
}
{
"_id" : ObjectId("595f7a773b80d3114d236a8a"),
"name" : "2",
"subs" : [
ObjectId("595f7a773b80d3114d236a89")
],
"__v" : 0
}
{
"_id" : ObjectId("595f7a773b80d3114d236a8b"),
"name" : "1",
"subs" : [
ObjectId("595f7a773b80d3114d236a8a")
],
"__v" : 0
}
I think a simpler approach would be to track the parent since that is unique instead of tracking an array of children which could get messy. There is a nifty module called mongoose-tree built just for this:
var tree = require('mongoose-tree');
var CompanySchema = new mongoose.Schema({
companyName: {
type: String,
required: true
},
estimatedAnnualEarnings: {
type: Number,
required: true
},
companyType: {type: String, enum: ['Main', 'Subsidiary']}
})
CompanySchema.plugin(tree);
module.exports = mongoose.model('Company', CompanySchema);
Set some test data:
var comp1 = new Company({ companyName: 'Company 1' });
var comp2 = new Company({ companyName: 'Company 2' });
var comp3 = new Company({ companyName: 'Company 3' });

comp3.parent = comp2;
comp2.parent = comp1;

comp1.save(function() {
  comp2.save(function() {
    comp3.save();
  });
});
Then use mongoose-tree to build a function that can get either the ancestors or children:
router.get('/company/:name/:action', function(req, res) {
  var name = req.params.name;
  var action = req.params.action;
  Company.findOne({ companyName: name }, function(err, comp) {
    // typical error handling omitted for brevity
    if (action == 'ancestors') {
      comp.getAncestors(function(err, companies) {
        // companies is an array
        res.send(companies);
      });
    } else if (action == 'children') {
      comp.getChildren(function(err, companies) {
        res.send(companies);
      });
    }
  });
});