Automate NodeJS Express Get and Post request using Cron

I have an existing route that performs a GET and a POST against the database:
router.post('/stackExample', async (req, res) => {
  try {
    //MAKE GET REQUEST FROM MONGODB
    const stackCron = await Borrower.aggregate([
      { $unwind: { path: "$application", preserveNullAndEmptyArrays: true } },
      {
        $project: {
          'branch': '$branch',
          'status': '$application.status',
        },
      },
      { $match: { status: 'Active' } },
    ]);
    //MAKE POST REQUEST TO MONGODB
    for (let k = 0; k < stackCron.length; k++) {
      const branch = stackCron[k].branch;
      const status = stackCron[k].status;
      const lrInterest = await Financial.updateOne(
        { accountName: 'Processing Fee Income' },
        {
          $push: {
            "transactions": {
              type: 'Credit',
              firstName: 'SysGen',
              lastName: 'SysGen2',
              amount: 100,
              date: new Date(),
            }
          }
        }
      );
    }
    res.json({ success: true, message: "Success" });
  } catch (err) {
    res.json({ success: false, message: 'An error occurred' });
  }
});
This code works fine when the request is made from the client, but I want to automate it with cron.
Here is what I did:
var CronJob = require('cron').CronJob;

var job = new CronJob('* * * * * *', function () {
  makeRequest();
}, null, true, 'America/Los_Angeles');
job.start();

function makeRequest(message) {
  //Copy-paste entire router post request.
}
There seems to be no response when I copy-paste my code into the function. What have I missed?

There is no response from the cron job because no request (and therefore no response object) ever reaches your makeRequest function. That makes sense, because a cron job runs independently of any incoming requests.
Another reason you might not be getting any data back from your updateOne operation is that it doesn't return the updated document; it returns the status of the operation instead. Take a look here. If you want the updated document, you might want to use findOneAndUpdate.
const response = await Todo.findOneAndUpdate(
  { _id: "a1s2d3f4f4d3s2a1s2d3f4" },
  { title: "Get Groceries" },
  { new: true }
);
// response will have the updated document.
// We won't need this here. This is just to show how to get the updated document
// without making another database query explicitly.
The body of your router function performs async/await operations, but you didn't declare makeRequest as async. This could also be the issue.
The cron job will update the database, but if you want to retrieve the updated documents you'll have to define a new route and make a GET call to the server with the required parameters/query.
Your makeRequest function will look something like this:
async function makeRequest() {
  try {
    //MAKE GET REQUEST FROM MONGODB
    const stackCron = await Borrower.aggregate([
      { $unwind: { path: "$application", preserveNullAndEmptyArrays: true } },
      {
        $project: {
          branch: "$branch",
          status: "$application.status",
        },
      },
      { $match: { status: "Active" } },
    ]);
    //MAKE POST REQUEST TO MONGODB
    for (let k = 0; k < stackCron.length; k++) {
      const branch = stackCron[k].branch;
      const status = stackCron[k].status;
      const lrInterest = await Financial.updateOne(
        { accountName: "Processing Fee Income" },
        {
          $push: {
            transactions: {
              type: "Credit",
              firstName: "SysGen",
              lastName: "SysGen2",
              amount: 100,
              date: new Date(),
            },
          },
        }
      );
    }
    /**
     * Write to a log file if you want to keep the record of this operation
     */
  } catch (err) {
    /**
     * Similarly write the error to the same log file as well.
     */
  }
}
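The comments above mention writing to a log file; a minimal sketch of that using Node's built-in fs module (the cron.log file name is just an example):
const fs = require("fs");

// Append a timestamped line to cron.log.
function logToFile(message) {
  fs.appendFileSync("cron.log", `${new Date().toISOString()} ${message}\n`);
}

// e.g. logToFile(`Updated ${stackCron.length} borrowers`) inside the try block,
// and logToFile(`Error: ${err.message}`) inside the catch block.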
In your cron job
var job = new CronJob(
  "* * * * * *", // six fields include seconds, so this fires every second
  async function () {
    await makeRequest();
  },
  null,
  true,
  "America/Los_Angeles"
);
Your new route
router.get("/stack/:accountName", async (req, res, next) => {
const { accountName } = req.params;
try {
const financial = await Financial.find({ accountName });
res.status(200).json({ message: "success", data: financial });
} catch (err) {
res.status(500).json({ message: "error", reason: err.message });
}
});
Simply call it as
fetch(
  `http://example.net/stack/${encodeURIComponent("Processing Fee Income")}`,
  { method: "GET" }
);
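If you also want to read the documents the route returns, a minimal sketch of consuming that response (same assumed host and route as above):
const res = await fetch(
  `http://example.net/stack/${encodeURIComponent("Processing Fee Income")}`
);
const { data } = await res.json(); // the route above responds with { message, data }
console.log(data);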

Related

API Only sends 1 chunk of metadata when called

I have a problem with my API that serves metadata when called from my smart contract or website. They are NFT tokens; my database is Postgres and the API is Node.js.
The problem is that when I mint 1 NFT the metadata works perfectly, but if I mint 2 or more it only ever sends 1 chunk of data, so only 1 NFT mints properly and the rest have no data.
Do I need to set a loop function or delay? Does anyone have any experience with this?
Any help would be much appreciated.
Below is the code from the "controller" folder labeled "nft.js"
const models = require("../../models/index");
const path = require("path");
const fs = require("fs");
module.exports = {
create_nft: async (req, res, next) => {
try {
const dir = path.resolve(__dirname + `../../../data/traitsfinal.json`);
const readCards = fs.readFileSync(dir, "utf8");
const parsed = JSON.parse(readCards);
console.log("ya data ha final ??", parsed);
parsed.forEach(async (item) => {
// return res.json(item)
let newNft = await models.NFT.create({
name: item.Name,
description: item.Description,
background: item.Background,
body: item.Body,
mouth: item.Mouth,
eyes: item.Eyes,
head_gear: item.Head_Gear,
tokenId: item.tokenId,
image: item.imagesIPFS,
});
});
return res.json({
data: "nft created",
error: null,
success: true,
});
} catch (error) {
console.log("server error", error.message);
next(error);
}
},
get_nft: async (req, res, next) => {
try {
const { id } = req.params;
// console.log("id ?????????",id)
// console.log("type of ",typeof(id))
// const n=Number(id)
// console.log("type of ",typeof(id))
const nft = await models.NFT.findByPk(id);
if (!nft) {
throw new Error("Token ID invalid");
}
if (!nft.isMinted) {
throw new Error("Token not minted");
}
console.log(nft);
// }
const resObj = {
name: nft.name,
description: nft.description,
image: `https://gateway.pinata.cloud/ipfs/${nft.image}`,
attributes: [
{ trait_type: "background", value: `${nft.background}` },
{ trait_type: "body", value: `${nft.body}` },
{ trait_type: "mouth", value: `${nft.mouth}` },
{ trait_type: "eyes", value: `${nft.eyes}` },
{ trait_type: "tokenId", value: `${nft.tokenId}` },
{
display_type: "number",
trait_type: "Serial No.",
value: id,
max_value: 1000,
},
],
};
return res.json(resObj);
} catch (error) {
console.log("server error", error.message);
next(error);
}
},
get_nft_all: async (req, res, next) => {
try {
// console.log("id ?????????",id)
// console.log("type of ",typeof(id))
// const n=Number(id)
// console.log("type of ",typeof(id))
const nft = await models.NFT.findAndCountAll({
limit: 10
});
// console.log(nft);
if (!nft) {
throw new Error("Token ID invalid");
}
// if (nft.isMinted) {
// throw new Error("Token not minted");
// }
// console.log(nft);
// }
var resObjarr = [];
for (var i = 0; i < nft.rows.length; i++) {
resObj = {
name: nft.rows[i].name,
description: nft.rows[i].description,
image: `https://gateway.pinata.cloud/ipfs/${nft.rows[i].image}`,
attributes: [
{ trait_type: "background", value: `${nft.rows[i].background}` },
{ trait_type: "body", value: `${nft.rows[i].body}` },
{ trait_type: "mouth", value: `${nft.rows[i].mouth}` },
{ trait_type: "eyes", value: `${nft.rows[i].eyes}` },
{ trait_type: "tokenId", value: `${nft.rows[i].tokenId}` },
{
display_type: "number",
trait_type: "Serial No.",
value: nft.rows[i].id,
max_value: 1000,
},
],
};
resObjarr.push(resObj);
}
console.log(JSON.stringify(resObjarr))
return res.json(resObjarr);
} catch (error) {
console.log("server error", error.message);
next(error);
}
},
mint: async (req, res, next) => {
try {
const { id } = req.params;
const updated = await models.NFT.findByPk(id);
if (!updated) {
throw new Error("NFT ID invalid");
}
if (updated.isMinted) {
throw new Error("NFT Already minted");
}
updated.isMinted = true;
updated.save();
return res.json({
data: "Token minted successfully",
error: null,
success: true,
});
} catch (error) {
console.log("server error", error.message);
next(error);
}
},
};
Below is from the routes folder.
const router = require("express").Router();
const auth=require("../middleware/auth")
const {
create_nft,
get_nft,
get_nft_all,
mint
} = require("../controller/nft");
router.post(
"/create",
create_nft
);
router.get(
"/metadata/:id",
get_nft
);
router.get(
"/metadata",
get_nft_all
);
router.put(
"/mint/:id",
mint
);
module.exports = router;
Looking at your code, you may have an asynchronous issue in this part:
parsed.forEach(async (item) => {
  // return res.json(item)
  let newNft = await models.NFT.create({
    name: item.Name,
    description: item.Description,
    background: item.Background,
    body: item.Body,
    mouth: item.Mouth,
    eyes: item.Eyes,
    head_gear: item.Head_Gear,
    tokenId: item.tokenId,
    image: item.imagesIPFS,
  });
});
.forEach is meant for synchronous callbacks, while NFT.create returns a promise (it is async), so the loop doesn't wait for the creates and things happen out of order: the res.json further down can run before the records are created.
One approach is to map the items to promises first and then perform a batch operation using Promise.all:
const data = parsed.map(item => {
  return models.NFT.create({
    name: item.Name,
    description: item.Description,
    background: item.Background,
    body: item.Body,
    mouth: item.Mouth,
    eyes: item.Eyes,
    head_gear: item.Head_Gear,
    tokenId: item.tokenId,
    image: item.imagesIPFS,
  })
})
const results = await Promise.all(data)
The main difference is that Promise.all resolves the N NFT.create promises in parallel. If the data is too big to process all at once and you want to limit how many run concurrently, you can use the async iteration provided by bluebird's Promise.map instead.
const Promise = require('bluebird')

const data = await Promise.map(parsed, item => {
  return models.NFT.create({
    name: item.Name,
    description: item.Description,
    background: item.Background,
    body: item.Body,
    mouth: item.Mouth,
    eyes: item.Eyes,
    head_gear: item.Head_Gear,
    tokenId: item.tokenId,
    image: item.imagesIPFS,
  })
})
return data
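Promise.map also accepts a concurrency option if you want to cap how many NFT.create calls run at the same time; a minimal sketch (the limit of 10 is just an example value):
const data = await Promise.map(
  parsed,
  item => models.NFT.create({ /* same fields as above */ }),
  { concurrency: 10 } // process at most 10 items at a time
)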

Import large amounts of data, but do a .find() for each element

I have a collection of 60,000 customers. I need to import a list of 50,000 people, but for each person I need to find the ID of the matching customer and add it to the object being inserted.
How should this be done most efficiently?
export default async ({ file, user, database }: Request, res: Response) => {
try {
const csv = file.buffer.toString("utf8");
let lines = await CSV({
delimiter: "auto" // delimiter used for separating columns.
}).fromString(csv);
let count = {
noCustomer: 0,
fails: 0
};
let data: any = [];
await Promise.all(
lines.map(async (item, index) => {
try {
// Find customer
const customer = await database
.model("Customer")
.findOne({
$or: [
{ "custom.pipeID": item["Organisasjon - ID"] },
{ name: item["Person - Organisasjon"] }
]
})
.select("_id")
.lean();
// If found a customer
if (!isNil(customer?._id)) {
return data.push({
name: item["Person - Navn"],
email: item["Person - Email"],
phone: item["Person - Telefon"],
customer: customer._id,
updatedAt: item["Person - Oppdater tid"],
createdAt: item["Person - Person opprettet"],
creator: users[item["Person - Eier"]] || "5e4bca71a31da7262c3707c5"
});
}
else {
return count.noCustomer++;
}
} catch (err) {
count.fails++;
return;
}
})
);
const people = await database.model("Person").insertMany(data)
res.send("Thanks!")
} catch (err) {
console.log(err)
throw err;
}
};
My code just never sends a response if I use this as an Express request.
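A sketch of one possible approach (untested, and assuming the same Mongoose models and CSV column names as in the snippet above): load the customers once, build an in-memory lookup, and then do a single insertMany, so the import doesn't issue one findOne per person.
// Fetch all customers once instead of querying per person.
const customers = await database
  .model("Customer")
  .find({}, { "custom.pipeID": 1, name: 1 })
  .lean();

// Index them by the two keys used for matching.
const byPipeId = new Map();
const byName = new Map();
for (const c of customers) {
  if (c.custom && c.custom.pipeID) byPipeId.set(c.custom.pipeID, c._id);
  if (c.name) byName.set(c.name, c._id);
}

const data = [];
for (const item of lines) {
  const customerId =
    byPipeId.get(item["Organisasjon - ID"]) || byName.get(item["Person - Organisasjon"]);
  if (!customerId) { count.noCustomer++; continue; }
  data.push({
    name: item["Person - Navn"],
    email: item["Person - Email"],
    phone: item["Person - Telefon"],
    customer: customerId,
    updatedAt: item["Person - Oppdater tid"],
    createdAt: item["Person - Person opprettet"],
    creator: users[item["Person - Eier"]] || "5e4bca71a31da7262c3707c5"
  });
}

await database.model("Person").insertMany(data);
res.send("Thanks!");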

MEAN stack how to find _id from database to send a PUT request

I'm having a problem identifying a 'task' in MongoDB from my Angular frontend.
This question is the most similar to my question but here it just says req.body.id and doesn't really explain how they got that.
This question involves what I am trying to do: update one document in a collection upon a click. What it does in the frontend isn't important. I just want to change the status text of the Task from "Active" to "Completed" onclick.
First I create a task and stick it in my database collection with this code:
createTask(): void {
const status = "Active";
const taskTree: Task = {
_id: this._id,
author: this.username,
createdBy: this.department,
intendedFor: this.taskFormGroup.value.taskDepartment,
taskName: this.taskFormGroup.value.taskName,
taskDescription: this.taskFormGroup.value.taskDescription,
expectedDuration: this.taskFormGroup.value.expectedDuration,
status: status
};
this.http.post("/api/tasks", taskTree).subscribe(res => {
this.taskData = res;
});
}
When I make this post to the backend, _id is magically filled in!
I'm just not sure how I can pass the id to the put request in nodejs router.put('/:id') when I'm pushing it from the frontend like this:
completeTask(): void {
const status = "Completed";
const taskTree: Task = {
_id: this._id,
author: this.username,
createdBy: this.department,
intendedFor: this.taskFormGroup.value.taskDepartment,
taskName: this.taskFormGroup.value.taskName,
taskDescription: this.taskFormGroup.value.taskDescription,
expectedDuration: this.taskFormGroup.value.expectedDuration,
status: status
};
console.log(taskTree);
this.http.put("/api/tasks/" + taskTree._id, taskTree).subscribe(res => {
this.taskData = res;
console.log(res);
});
}
In the template I have a form that's filled in and the data is immediately outputted to a task 'card' on the same page.
When I send the PUT request from Angular, the backend receives the data just fine and logs what I ask for in task-routes.js:
router.put("/:id", (req, res, next) => {
const taskData = req.body;
console.log(taskData);
const task = new Task({
taskId: taskData._id,
author: taskData.author,
createdBy: taskData.createdBy,
intendedFor: taskData.intendedFor,
taskName: taskData.taskName,
taskDescription: taskData.taskDescription,
expectedDuration: taskData.expectedDuration,
status: taskData.status
})
Task.updateOne(req.params.id, {
$set: task.status
},
{
new: true
},
function(err, updatedTask) {
if (err) throw err;
console.log(updatedTask);
}
)
});
The general response I get for the updated info is:
{
  author: "there's a name here",
  createdBy: 'management',
  intendedFor: null,
  taskName: null,
  taskDescription: null,
  expectedDuration: null,
  status: 'Completed'
}
Now, I know _id is created automatically in the database, so when I click create task and it outputs to the 'card', the console log of the task after I save() it in the POST request shows taskId: undefined. This is all fine and dandy, but I have to send a unique identifier from the frontend Task interface so that when I send the PUT request, Node.js gets the same id that was POSTed.
I'm quite confused at this point.
So I finally figured this out. In case it helps someone, here's what finally worked:
First I moved my update function to my trigger service and switched the request from PUT to PATCH:
Trigger Service
tasks: Task[] = [];

updateTask(taskId, data): Observable<Task> {
  return this.http.patch<Task>(this.host + "tasks/" + taskId, data);
}
I also created a get request in the trigger service file to find all the documents in a collection:
getTasks() {
  return this.http.get<Task[]>(this.host + "tasks");
}
Angular component
Get tasks in ngOnInit to list them when the component loads:
ngOnInit() {
  this.triggerService.getTasks().subscribe(
    tasks => {
      this.tasks = tasks as Task[];
      console.log(this.tasks);
    },
    error => console.error(error)
  );
}
Update:
completeTask(taskId, data): any {
  this.triggerService.updateTask(taskId, data).subscribe(res => {
    console.log(res);
  });
}
Angular template (html)
<button mat-button
        class="btn btn-lemon"
        (click)="completeTask(task._id)"
>Complete Task</button>
<!-- task._id comes from `*ngFor="let task of tasks"`, "tasks" being the name of the array
     (or interface array) in your component file. "task" is any name you give it,
     but I think the singular form of your array is the normal practice. -->
Backend Routes
GET all tasks:
router.get("", (req, res, next) => {
Task.find({})
.then(tasks => {
if (tasks) {
res.status(200).json(tasks);
} else {
res.status(400).json({ message: "all tasks not found" });
}
})
.catch(error => {
response.status(500).json({
message: "Fetching tasks failed",
error: error
});
});
});
Update 1 field in specified document (status from "Active" to "Completed"):
router.patch("/:id", (req, res, next) => {
const status = "Completed";
console.log(req.params.id + " IT'S THE ID ");
Task.updateOne(
{ _id: req.params.id },
{ $set: { status: status } },
{ upsert: true }
)
.then(result => {
if (result.n > 0) {
res.status(200).json({
message: "Update successful!"
});
}
})
.catch(error => {
res.status(500).json({
message: "Failed updating the status.",
error: error
});
});
});
Hope it helps someone!

GraphQL with RESTful returning empty response

I am connecting GraphQL with REST endpoints. I confirmed that whenever I call http://localhost:3001/graphql it hits the REST endpoint, which returns a JSON response to the GraphQL server, but I am getting an empty response from the GraphQL server to the GUI as follows:
{
  "data": {
    "merchant": {
      "id": null
    }
  }
}
Query (decoded manually):
http://localhost:3001/graphql?query={
  merchant(id: 1) {
    id
  }
}
Below is what my GraphQLObjectType looks like:
const MerchantType = new GraphQLObjectType({
name: 'Merchant',
description: 'Merchant details',
fields : () => ({
id : {
type: GraphQLString // ,
// resolve: merchant => merchant.id
},
email: {type: GraphQLString}, // same name as field in REST response, so resolver is not requested
mobile: {type: GraphQLString}
})
});
const QueryType = new GraphQLObjectType({
name: 'Query',
description: 'The root of all... queries',
fields: () => ({
merchant: {
type: merchant.MerchantType,
args: {
id: {type: new GraphQLNonNull(GraphQLID)},
},
resolve: (root, args) => rest.fetchResponseByURL(`merchant/${args.id}/`)
},
}),
});
Response from REST endpoint (I also tried with single object in JSON instead of JSON array):
[
  {
    "merchant": {
      "id": "1",
      "email": "a#b.com",
      "mobile": "1234567890"
    }
  }
]
REST call using node-fetch
function fetchResponseByURL(relativeURL) {
return fetch(`${config.BASE_URL}${relativeURL}`, {
method: 'GET',
headers: {
Accept: 'application/json',
}
})
.then(response => {
if (response.ok) {
return response.json();
}
})
.catch(error => { console.log('request failed', error); });
}
const rest = {
fetchResponseByURL
}
export default rest
GitHub: https://github.com/vishrantgupta/graphql
JSON endpoint (dummy): https://api.myjson.com/bins/8lwqk
Edit: Adding the node.js tag; this may be an issue with the promise object.
Your fetchResponseByURL function gets an empty string.
I think the main problem is that you are using the wrong function to get your JSON string. Please try installing request-promise and use it to fetch the JSON.
https://github.com/request/request-promise#readme
something like
var rp = require('request-promise');

function fetchResponseByURL(relativeURL) {
  return rp('https://api.myjson.com/bins/8lwqk')
    .then((html) => {
      const data = JSON.parse(html)
      return data.merchant
    })
    .catch((err) => console.error(err));
    // .catch(error => { console.log('request failed', error); });
}
In this case, using data.merchant solved my problem. But the suggested solution above, i.e. the use of JSON.parse(...), might not be the best practice, because if there is no object in the JSON the expected response should be as follows:
{
  "data": {
    "merchant": null
  }
}
Instead of the fields being null:
{
  "data": {
    "merchant": {
      "id": null // even though merchant is null in the JSON,
                 // I am getting a merchant object in the response from GraphQL
    }
  }
}
I have updated my GitHub: https://github.com/vishrantgupta/graphql with working code.
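For reference, a minimal sketch (untested) of how the original node-fetch version could unwrap the sample response shown above and fall back to null when nothing is found; config.BASE_URL and the response shape are taken from the question:
function fetchResponseByURL(relativeURL) {
  return fetch(`${config.BASE_URL}${relativeURL}`, {
    method: 'GET',
    headers: { Accept: 'application/json' }
  })
    .then(response => (response.ok ? response.json() : null))
    .then(json => {
      if (!json) return null;
      // The sample endpoint wraps the object in an array: [ { merchant: {...} } ]
      const entry = Array.isArray(json) ? json[0] : json;
      return (entry && entry.merchant) ? entry.merchant : null;
    })
    .catch(error => { console.log('request failed', error); return null; });
}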

Routing to sub docs with express 4 and mongoose

EDIT: It's possible the problem is an issue with pathing. My current query looks like this:
router.route('/projects/:project_id/techDetails')
  .get(function(req, res) {
    Project.findById(req.params.project_Id, function(err, project) {
      if (err)
        return res.send(err);
      res.json(project);
      console.log('get success (project techDetails)');
    });
  });
This returns null, even though it's identical to a working line of code in every way except for the addition of '/techDetails' to the route.
Original question:
I'm building a MEAN stack app with Express and Mongo. I can't figure out how to route to nested documents properly.
Here is my Project schema:
const ProjectSchema = new Schema({
idnumber: { type: Number, required: true },
customername: String,
projectdetails: String,
jobaddress: String,
techDetails: [{
scope: String,
edgedetail: String,
lamination: String,
stonecolour: String,
slabnumber: String,
slabsupplier: String,
purchaseordernum: String,
splashbacks: String,
apron: String,
hotplate: String,
sink: String,
sinkdetails: String,
tappos: String
}],
sitecontactname: String,
sitecontactnum: String,
specialreq: String,
install_date: String,
created_on: { type: Date, default: Date.now },
created_by: { type: String, default: 'SYSTEM' },
active: { type: Boolean, default: true },
flagged: { type: Boolean, default: false },
});
I can successfully route to /projects with GET and POST, and /projects/:project_id with GET, PUT and DEL.
Using the PUT route and a project's _id I can push new entries to a project's techDetails subdoc array. The resulting JSON data looks like this:
{
"_id": "59e577e011a3f512b482ef13",
"idnumber": 52,
"install_date": "10/20/2017",
"specialreq": "some...",
"sitecontactnum": "987654321",
"sitecontactname": "bill",
"jobaddress": "123 st",
"projectdetails": "some stuff",
"customername": "B Builders",
"__v": 16,
"flagged": false,
"active": true,
"created_by": "SYSTEM",
"created_on": "2017-10-17T03:24:16.423Z",
"techDetails": [
{
"scope": "Howitzer",
"edgedetail": "12mm",
"lamination": "No",
"stonecolour": "Urban™",
"slabnumber": "1",
"slabsupplier": "Caesarstone",
"purchaseordernum": "no",
"splashbacks": "No",
"apron": "No",
"hotplate": "N/A",
"sink": "N/A",
"sinkdetails": "no",
"tappos": "no",
"_id": "59e577e011a3f512b482ef14"
},
{
"scope": "kitchen",
"edgedetail": "12mm",
"lamination": "etc",
"_id": "59e7da445d9d7e109c18f38b"
},
{
"scope": "Vanity",
"edgedetail": "12mm",
"lamination": "No",
"stonecolour": "Linen™",
"slabnumber": "1",
"slabsupplier": "Caesarstone",
"purchaseordernum": "1",
"splashbacks": "No",
"apron": "No",
"hotplate": "N/A",
"sink": "N/A",
"sinkdetails": "no",
"tappos": "woo",
"_id": "59e81e3324fb750fb46f8248"
}//, more entries omitted for brevity
]
}
As you can see, everything so far is working as expected. However, now I need to edit and delete individual entries in this techDetails array. I'd also like to route to them directly using projects/:project_id/techDetails and projects/:project_id/techDetails/:techdetails_id.
From what I can see there are two approaches to this. Either I can:
A) use a new routing file for the techDetails that uses mergeParams. This is the approach I'm trying currently; however, I can't figure out how to complete the .find to return all techDetails, since I can only use the Project model schema and I'm unsure how to access the subdocs.
An excerpt from my routes.js:
const techDetails = require('./techDetails');
//other routes here
//see techdetails file
router.use('/projects/:project_id/techdetails', techDetails);
//here lies an earlier, failed attempt
/* router.route('/projects/:project_id/techdetails/:techDetails_id')
.get(function(req, res) {
Project.findById(req.params.project_id.techDetails_id, function(err,
project) {
if (err)
return res.send(err);
res.json(project.techDetails);
console.log('get success (techDetails)');
});
})
; */
And my techdetails.js:
const express = require('express');
const Project = require('./models/project');
const router = express.Router({mergeParams: true});
router.get('/', function (req, res, next) {
/* Project.find(function(err, techDetails) {
if (err)
return res.send(err);
res.json(techDetails);
console.log('get success (all items)');
}); */
res.send('itemroutes ' + req.params);
})
router.get('/:techDetails_id', function (req, res, next) {
res.send('itemroutes ' + req.params._id)
})
module.exports = router
I can successfully check that the routes work with Postman; both receive the response. Now the problem is that, instead of res.send, I want to use res.json with Project.find (or similar) to get the techDetails.
However, there is also another option:
B) put the techDetails document into its own schema and then populate an array of IDs inside projects.
However, this seems more complex, so I'd rather avoid it if I can.
Any thoughts and suggestions welcome; let me know if more of my code is needed.
In this particular case I would put techDetails in a separate schema:
const ProjectSchema = new Schema({
  idnumber: { type: Number, required: true },
  customername: String,
  projectdetails: String,
  jobaddress: String,
  techDetails: [techDetailsSchema],
  sitecontactname: String,
  sitecontactnum: String,
  specialreq: String,
  install_date: String,
  created_on: { type: Date, default: Date.now },
  created_by: { type: String, default: 'SYSTEM' },
  active: { type: Boolean, default: true },
  flagged: { type: Boolean, default: false },
});
Don't register the techDetails schema with mongoose.model as it is a subdocument. Put it in a separate file and require it in the project model file (const techDetailsSchema = require('./techDetails.model');).
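A minimal sketch of what that separate file might look like, reusing the field list from the original schema (the ./techDetails.model file name follows the require above):
// techDetails.model.js
const mongoose = require('mongoose');

const techDetailsSchema = new mongoose.Schema({
  scope: String,
  edgedetail: String,
  lamination: String,
  stonecolour: String,
  slabnumber: String,
  slabsupplier: String,
  purchaseordernum: String,
  splashbacks: String,
  apron: String,
  hotplate: String,
  sink: String,
  sinkdetails: String,
  tappos: String
});

// Export the schema itself, not a registered model.
module.exports = techDetailsSchema;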
I would create the controller functions like this:
Getting with GET (all):
module.exports.techDetailsGetAll = function (req, res) {
const projectId = req.params.projectId;
Project
.findById(projectId)
.select('techDetails')
.exec(function (err, project) {
let response = { };
if (err) {
response = responseDueToError(err);
} else if (!project) {
response = responseDueToNotFound();
} else {
response.status = HttpStatus.OK;
response.message = project.techDetails;
}
res.status(response.status).json(response.message);
});
}
Getting with GET (one):
module.exports.techDetailsGetOne = function (req, res) {
const projectId = req.params.projectId;
const techDetailId = req.params.techDetailId;
Project
.findById(projectId)
.select('techDetails')
.exec(function (err, project) {
let response = { };
if (err) {
response = responseDueToError(err);
} else if (!project) {
response = responseDueToNotFound();
} else {
let techDetails = project.techDetails.id(techDetailId);
if (techDetails === null) {
response = responseDueToNotFound();
} else {
response.status = HttpStatus.OK;
response.message = techDetails;
}
}
res.status(response.status).json(response.message);
});
}
For adding with POST:
module.exports.techDetailsAddOne = function (req, res) {
const projectId = req.params.projectId;
let newTechDetails = getTechDetailsFromBody(req.body);
Project
.findByIdAndUpdate(projectId,
{ '$push': { 'techDetails': newTechDetails } },
{
'new': true,
'runValidators': true
},
function (err, project) {
let response = { };
if (err) {
response = responseDueToError(err);
} else if (!project) {
response = responseDueToNotFound();
} else {
response.status = HttpStatus.CREATED;
response.message = project.techDetails; // for example
}
res.status(response.status).json(response.message);
});
}
For updating with PUT
module.exports.techDetailsUpdateOne = function (req, res) {
const projectId = req.params.projectId;
const techDetailId = req.params.techDetailId;
let theseTechDetails = getTechDetailsFromBody(req.body);
theseTechDetails._id = techDetailId; // can be skipped if body contains id
Project.findOneAndUpdate(
{ '_id': projectId, 'techDetails._id': techDetailId },
{ '$set': { 'techDetails.$': theseTechDetails } },
{
'new': true,
'runValidators': true
},
function (err, project) {
let response = { };
if (err) {
response = responseDueToError(err);
res.status(response.status).json(response.message);
} else if (!project) {
response = responseDueToNotFound();
res.status(response.status).json(response.message);
} else {
project.save(function (err) {
if (err) {
response = responseDueToError(err);
} else {
response.status = HttpStatus.NO_CONTENT;
}
res.status(response.status).json(response.message);
})
}
});
}
And deleting with DELETE:
module.exports.techDetailsDeleteOne = function (req, res) {
const projectId = req.params.projectId;
const techDetailId = req.params.techDetailId;
Project
.findById(projectId)
.select('techDetails')
.exec(function (err, project) {
let response = { }
if (err) {
response = responseDueToError(err);
res.status(response.status).json(response.message);
} else if (!project) {
response = responseDueToNotFound();
res.status(response.status).json(response.message);
} else {
let techDetail = project.techDetails.id(techDetailId);
if (techDetail !== null) {
project.techDetails.pull({ '_id': techDetailId });
project.save(function (err) {
if (err) {
response = responseDueToError(err);
} else {
response.status = HttpStatus.NO_CONTENT;
}
res.status(response.status).json(response.message);
})
} else {
response = responseDueToNotFound();
res.status(response.status).json(response.message);
}
}
});
}
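The controller functions above rely on a few helpers (HttpStatus, responseDueToError, responseDueToNotFound, getTechDetailsFromBody) that aren't shown; a minimal sketch of what they might look like, purely as an assumption:
// Plain status-code constants (stand-in for whatever HttpStatus module is assumed above).
const HttpStatus = {
  OK: 200,
  CREATED: 201,
  NO_CONTENT: 204,
  NOT_FOUND: 404,
  INTERNAL_SERVER_ERROR: 500
};

function responseDueToError(err) {
  return { status: HttpStatus.INTERNAL_SERVER_ERROR, message: { error: err.message } };
}

function responseDueToNotFound() {
  return { status: HttpStatus.NOT_FOUND, message: { error: 'Not found' } };
}

// Whitelist the fields taken from the request body (field names from the original schema).
function getTechDetailsFromBody(body) {
  return {
    scope: body.scope,
    edgedetail: body.edgedetail,
    lamination: body.lamination,
    stonecolour: body.stonecolour,
    slabnumber: body.slabnumber,
    slabsupplier: body.slabsupplier,
    purchaseordernum: body.purchaseordernum,
    splashbacks: body.splashbacks,
    apron: body.apron,
    hotplate: body.hotplate,
    sink: body.sink,
    sinkdetails: body.sinkdetails,
    tappos: body.tappos
  };
}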
And finally routing like this:
router
  .route('/projects')
  .get(ctrlProjects.projectsGetAll)
  .post(ctrlProjects.projectsAddOne);

router
  .route('/projects/:projectId')
  .get(ctrlProjects.projectsGetOne)
  .put(ctrlProjects.projectsUpdateOne)
  .delete(ctrlProjects.projectsDeleteOne);

router
  .route('/projects/:projectId/techDetails')
  .get(ctrlTechDetails.techDetailsGetAll)
  .post(ctrlTechDetails.techDetailsAddOne);

router
  .route('/projects/:projectId/techDetails/:techDetailId')
  .get(ctrlTechDetails.techDetailsGetOne)
  .put(ctrlTechDetails.techDetailsUpdateOne)
  .delete(ctrlTechDetails.techDetailsDeleteOne);
This is what I prefer when I'm constantly updating the subdocument independently of the rest of the document. It doesn't create a separate collection, so no need for populate.
EDIT:
This answer goes more into detail on whether you should use embedding or referencing. My answer uses embedding.
So, the solution I came to was a combo of A) and B). I used a separate routing file with ({mergeParams: true}) in the router declaration, and I created a separate file for the techDetails nested model without declaring it. However, I don't believe either of these actually made any difference... but anyway.
The working code I ended up with was, in my routes:
router.use('/projects/:project_id/techDetails', TechDetails);
And in techDetails.js:
const router = express.Router({mergeParams: true});

router.route('/')
  .get(function(req, res) {
    Project.findById(req.params.project_id,
      'techDetails', function(err, project) {
        if (err)
          return res.send(err);
        res.json(project);
        console.log('get success (project techDetails)');
      });
  });
What's different about it? Mainly the 'techDetails' parameter in the Project.findById line; according to the Mongoose API this acts as a select statement. The only other major difference is that I fixed a typo in my original code (project_id was written project_Id... dubious). I probably would have noticed this if I was using VS or something instead of Notepad++, but it is my preferred coding arena.
It may be possible to return res.json(project.techDetails) and remove the 'techDetails', select parameter, but I likely won't test this.
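For completeness, a minimal sketch of that untested variant (same route as above, just returning the subdocument array directly):
router.route('/')
  .get(function(req, res) {
    Project.findById(req.params.project_id, function(err, project) {
      if (err)
        return res.send(err);
      // Return only the subdocument array instead of the whole project.
      res.json(project.techDetails);
    });
  });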
Edit: Turns out migrating techDetails to a separate file meant they no longer generated with ObjectIds, which are crucial for PUT and DEL. I might've been able to work around that with a simple pair of curly braces inside the array declaration, but I didn't think of that until after I re-migrated it back to the project schema...
