sails.js - create multiple records and their associations with one POST - node.js

Invoice-Model:
attributes: {
  number: {
    type: 'integer'
  },
  lines: {
    collection: 'line',
    via: 'invoice'
  }
}
Line-Model:
attributes: {
  name: {
    type: 'integer'
  },
  invoice: {
    model: 'invoice'
  }
}
As you can see these models have a One-To-Many relationship. Everything is working fine.
But now I want to create a new Invoice and new Lines with the Blueprint API that are associated.
The documentation says you can create a new record and add it to an existing one with this schema: POST /:model/:id/:association/:fk
But it does not state if it is possible to create two records at the same time and associate them.
More details: I've got an invoice and in this invoice you can add lines with products, their quantity and other stuff. Now when the user clicks on save, I need to create a new invoice and the new lines and associate them somehow.
Should I create a custom controller action for this, or am I overthinking it and should I approach the whole thing completely differently?

Creating a new invoice together with new lines is possible in Sails.js with the following code:
Invoice.create({
  number: 1,
  lines: [
    {
      name: '1'
    }
  ]
})
This will create a new invoice with number 1 and also create a new line with the name 1; the line will be associated with the invoice.
Since lines is a collection, you can pass them as an array, which makes it possible to add more than one Line to your Invoice.
You can override the create action in your InvoiceController and add this code.
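A minimal sketch of such an override, assuming the request body already carries the nested lines array; the action body is illustrative, and on Sails 1.x you would need .fetch() to get the created record back:

// api/controllers/InvoiceController.js
module.exports = {
  create: function (req, res) {
    // Expects a body like { number: 1, lines: [{ name: '1' }] }
    Invoice.create({
      number: req.param('number'),
      lines: req.param('lines')
    }).exec(function (err, invoice) {
      if (err) return res.serverError(err);
      return res.json(invoice);
    });
  }
};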
An alternative solution is using Promises. Make sure you install bluebird by using the command:
npm install bluebird
Put the following code at the top of your controller:
var Promise = require('bluebird');
You can use the following code:
createWithPromises: function (req, res) {
  var lineName = 1;
  var invoiceNumber = 2;
  Invoice.create({ number: invoiceNumber })
    .then(function (result) {
      // Return the promise so the next .then() receives the created Line
      return Line.create({ name: lineName, invoice: result });
    })
    .then(function (result) {
      sails.log(result);
    });
}
First it creates an Invoice with number 2. If that succeeds, it creates a Line, passing the result of the previous create call as the invoice; note that the Line.create promise is returned so the next .then() receives the created Line.
For more information about Promises, check http://bluebirdjs.com/docs/getting-started.html

Related

SuiteScript Updating Search Result Fields

I have a search in SuiteScript 2.0 that's working fine. But for each record the search brings back, I want to update a particular field (I use it elsewhere to determine that the record has been examined). It's potentially a very large result set, so I have to page it. Here's my code:
var sResult = mySearch.runPaged({ pageSize: 10 });
for (var pageIndex = 0; pageIndex < sResult.pageRanges.length; pageIndex++)
{
  var searchPage = sResult.fetch({ index: sResult.pageRanges[pageIndex].index });
  searchPage.data.forEach(function (result)
  {
    var name = result.getValue({ name: "altname" });
    result.setValue({
      name: 'tracker',
      value: new Date()
    });
  });
}
You can see where I have a call to result.setValue(), which is a non-existent function. But it shows where I want to update the 'tracker' field and what data I want to use.
Am I barking up the wrong tree entirely here? How do I update the field for each result returned?
As Simon says, you can't directly update a search result, but you can use the submitFields method.
This example is from NetSuite documentation:
var otherId = record.submitFields({
  type: 'customrecord_book', // record type
  id: '4',                   // record id
  values: {
    'custrecord_rating': '2'
  }
});
This approach also consumes less governance than loading and saving the record.
AFAIK you can't directly update a search result and write the value back to the record; the record needs to be loaded first. The snippet doesn't say what type of record you're searching for or want to load, but the general idea is (in place of your result.setValue line):
var loadedRecord = record.load({ type: 'myrecordtype', id: result.id });
loadedRecord.setValue({ fieldId: 'tracker', value: new Date() });
loadedRecord.save();
Keep in mind SuiteScript governance and the number of records you're modifying. Load and save too many and your script will terminate earlier than you expect.
That is a poor design: instead of calling result.setValue inside the iteration, push those updates to an "update" array; then, after the data.forEach, loop through the update array and process the entries with record.submitFields().
Be careful of governance.
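A rough sketch of that pattern in SuiteScript 2.0, assuming the N/record module is loaded as record and reusing the paged search and 'tracker' field from the question; the record type and value formatting are illustrative:

// Collect the internal ids first, then update them outside the search loop
var ids = [];
var pagedData = mySearch.runPaged({ pageSize: 1000 });
pagedData.pageRanges.forEach(function (pageRange) {
  pagedData.fetch({ index: pageRange.index }).data.forEach(function (result) {
    ids.push(result.id);
  });
});

ids.forEach(function (id) {
  // submitFields uses less governance than record.load + record.save
  record.submitFields({
    type: 'customrecord_mytype', // illustrative record type
    id: id,
    values: { tracker: new Date() } // adjust the value format to your field type
  });
});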

Express Route Parameters creating duplicate custom paths

I am creating a to-do list app and I am trying to include custom routes using Express route parameters. The program worked fine up until this point, but as soon as I tried to introduce these custom paths and log them to the console, or even add them to the database, the very first path gets added twice or logged twice. The paths added later are not duplicated, though.
app.get("/:customListName", function(req,res)
{
console.log(req.params.customListName);
/*const list = new List
({
name: customListName,
items: defaultItems
});
list.save();*/
});
For example, if the custom path added is called "home", i.e., "localhost:3000/home" and I am trying to console.log the name of the path, it will print "home" twice. Later, if I add paths like "work", "new" etc they are added (and printed) only once. Sometimes this error is also coming up:
BulkWriteError: E11000 duplicate key error collection: wolDB.items index: _id_ dup key: { _id: ObjectId('5ef4ad2110f45d54f143fa19') }
I have tried dropping the database and starting afresh, dropping indexes, even using a new database; but it seems the problem is not in the database, because even without pushing anything to the database the behaviour is the same. I also tried rewriting the whole thing from scratch, but the problem persists.
I have also tried Mongoose's findOne(), but when I try to print whether the given route exists or not, for the first one it just prints "Exists" twice.
List.findOne({ name: customListName }, function (err, foundList)
{
  if (!err)
  {
    if (!foundList)
      console.log("Doesn't exist");
    else
      console.log("Exists");
  }
});
Here's the GitHub link:
https://github.com/sebanti10/todolist.git
If you don't need the list object and want to save directly to the database, try using:
List.create({
  name: customListName,
  items: defaultItems
})
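A minimal usage sketch inside the route handler, assuming the List model and defaultItems array from the question (the responses are illustrative):

app.get("/:customListName", function (req, res) {
  const customListName = req.params.customListName;
  List.findOne({ name: customListName }, function (err, foundList) {
    if (err) return res.sendStatus(500);
    if (!foundList) {
      // No list with this name yet, so create one directly
      List.create({ name: customListName, items: defaultItems }, function (err, newList) {
        if (err) return res.sendStatus(500);
        res.send(newList);
      });
    } else {
      res.send(foundList);
    }
  });
});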
Look at your schemas. Nowhere are you using new mongoose.Schema(). When you use mongoose.model("Item", itemsSchema) you need to pass in an actual schema, not just any object.
You need to change your so-called schemas to something along the lines of:
const itemsSchema = new Schema({
  name: String
});
Schema is available under mongoose, so you can either use mongoose.Schema directly or grab it with const Schema = mongoose.Schema; or const { Schema } = mongoose;
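A minimal sketch of that setup, assuming the Item and List models from the question and that each list embeds its items as subdocuments:

const mongoose = require("mongoose");
const { Schema } = mongoose;

// Real schemas, not plain objects
const itemsSchema = new Schema({
  name: String
});

const listSchema = new Schema({
  name: String,
  items: [itemsSchema] // each list embeds its own item documents
});

const Item = mongoose.model("Item", itemsSchema);
const List = mongoose.model("List", listSchema);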

Mongoose: Bulk upsert but only update records if they meet certain criteria

I am designing an item inventory system for a website that I am building.
The user's inventory is loaded from a Web API. This information is then processed so that it is more suited to my web app. I am trying to combine all the item records into one MongoDB collection - so other user inventories will be cached in the same place. What I have to deal with is deleting old item records if they are missing from the user's inventory (i.e. they sold it to someone) and also upserting the new items. Please note I have looked through several Stack Overflow questions about bulk upserts but I was unable to find anything about conditional updates.
Each item has two unique identifiers (classId and instanceId) that allow me to look them up (I have to use both IDs to match it) which remain constant. Some information about the item, such as its name, can change and therefore I want to be able to update those records when I fetch new inventory information. I also want new items that my site hasn't seen before to be added to my database.
Once the data returned from the Web API has been processed, it is left in a large array of objects. This means I am able to use bulk writing, however, I am unaware of how to upsert with conditions with multiple records.
Here is part of my item schema:
const ItemSchema = new mongoose.Schema({
  ownerId: {
    type: String,
    required: true
  },
  classId: {
    type: String,
    required: true
  },
  instanceId: {
    type: String,
    required: true
  },
  name: {
    type: String,
    required: true
  }
  // rest of item attributes...
});
User inventories typically contain 600 or more items, with a max count of 2500.
What is the most efficient way of upserting this much data? Thank you
Update:
I have had trouble implementing the solution to the bulk insert problem. I made a few assumptions and I don't know if they were right. I interpreted _ as lodash, response.body as the JSON returned by the API and myListOfItems also as that same array of items.
import Item from "../models/item.model";
import _ from 'lodash';

async function storeInventory(items) {
  let bulkUpdate = Item.collection.initializeUnorderedBulkOp();
  _.forEach(items, (data) => {
    if (data !== null) {
      let newItem = new Item(data);
      bulkUpdate.find({
        classId: newItem.classId,
        instanceId: newItem.instanceId
      }).upsert().updateOne(newItem);
      items.push(newItem);
    }
  });
  await bulkUpdate.execute();
}
Whenever I run this code, it throws an error complaining that an _id field is being changed, even though the objects I create don't specify an _id, and the few nested schema objects don't make a difference to the outcome when I change them to plain objects.
I understand that if no _id is sent to MongoDB it auto-generates one, but if it is updating a record it wouldn't do that anyway. I also tried setting _id to null on each item, but to no avail.
Have I misunderstood anything about the accepted answer? Or is my problem elsewhere in my code?
This is how I do it:
let bulkUpdate = MyModel.collection.initializeUnorderedBulkOp();
// myItems is your array of items
_.forEach(myItems, (item) => {
  if (item !== null) {
    let newItem = new MyModel(item);
    bulkUpdate.find({ yyy: newItem.yyy }).upsert().updateOne(newItem);
  }
});
await bulkUpdate.execute();
I think the code is pretty readable and understandable. You can adjust it to make it work with your case :)
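A sketch of that pattern adapted to the Item model from the question; converting each document to a plain object, stripping the auto-generated _id, and using $set is an assumption on my part to avoid the immutable-_id error, not something stated in the answer above:

import _ from "lodash";
import Item from "../models/item.model";

async function storeInventory(items) {
  const bulkUpdate = Item.collection.initializeUnorderedBulkOp();
  _.forEach(items, (data) => {
    if (data !== null) {
      // Shape the data through the model, then drop the auto-generated
      // _id so the upsert never tries to overwrite an existing one.
      const doc = new Item(data).toObject();
      delete doc._id;
      bulkUpdate
        .find({ classId: doc.classId, instanceId: doc.instanceId })
        .upsert()
        .updateOne({ $set: doc });
    }
  });
  await bulkUpdate.execute();
}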

node not recognizing duplicated entries

I'm trying to create a basic MEAN stack CRUD API to add shops into my database. I want every shop to have a unique name (to avoid adding duplicates). So far, everything gets saved into the database even if I post the same request 10 times. I went through the code a couple of times and can't figure out what's wrong; if anyone could point me in the right direction I'd be very grateful.
shop model:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var bcrypt = require('bcrypt-nodejs');
//shop schema
var ShopSchema = new Schema({
  name: { type: String, required: true, index: { unique: true } },
  address: { type: String, required: true, index: { unique: true } }
});
module.exports = mongoose.model('Shop', ShopSchema);
post function:
apiRouter.route('/shops')
  // create a shop
  .post(function (req, res) {
    // new instance of shop model
    var shop = new Shop();

    // set the shop information
    shop.name = req.body.name;
    shop.address = req.body.address;

    // save shop and check for errors
    shop.save(function (err) {
      if (err) {
        // duplicate entry
        if (err.code == 11000) {
          return res.json({ success: false, message: 'A shop with that name already exists.' });
        } else {
          return res.send(err);
        }
      } else {
        res.json({ message: 'Shop created!' });
      }
    });
  })
I do not receive errors of any kind; like I said, everything just gets written into the database.
Thanks for the help.
Basically, the unique index hasn't finished being built before the new entries are saved. You can read more about creating unique keys Here, but the gist is below. The solution is to create an index over the unique fields ahead of time.
When we declare a property to be unique, we're actually declaring that we want a database-level index on that property. Some database abstraction layers will issue a query to see if there's another record with the same value for the unique property, and if that query comes back empty, it allows the save or update to proceed. If you trust this method, you either have incredibly low traffic or you're about to learn about race conditions, because 2 or more requests could have their checks to the database occur before any writes go out, and you end up with non-unique data in your DB.
In between the time that check query is issued, another insert could come along doing the exact same thing, and you still end up with duplication. Uniqueness can’t be correctly validated at the application level. So it’s good that Mongoose tries to create an index for us.
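A minimal sketch of waiting for that index before accepting writes, assuming Mongoose 5+ where Model.init() returns a promise that resolves once the schema's indexes are built; the require path and port are illustrative:

var Shop = require('./app/models/shop'); // illustrative path to the model above

// Wait for the unique indexes on name/address to be built
// before the API starts accepting POST requests.
Shop.init()
  .then(function () {
    app.listen(3000, function () {
      console.log('Indexes built, server listening on port 3000');
    });
  })
  .catch(function (err) {
    // Index creation fails if the collection already contains duplicates
    console.error(err);
  });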

Mongoose key/val set on instance not shown in JSON or console... why?

I have some information on my Mongoose models which is transient. For performance reasons I don't wish to store it against the model, but I do want to be able to provide this information to clients that connect to my server and ask for it.
Here's a simple example:
var mongoose = require('mongoose'),
    db = require('./dbconn').dbconn;

var PersonSchema = new mongoose.Schema({
  name: String,
  age: Number
});

var Person = db.model('Person', PersonSchema);
var fred = new Person({ name: 'fred', age: 100 });
The Person schema has two attributes that I want to store (name and age). This works, and we see in the console:
console.log(fred);
{ name: 'fred', age: 100, _id: 509edc9d8aafee8672000001 }
I do, however, have one attribute ("status") that changes rapidly, and I don't want to store it in the database, but I do want to track it dynamically and provide it to clients, so I add it onto the instance as a key/val pair.
fred.status = "alive";
If we look at fred in the console again after adding the "alive" key/val pair, we again see fred, but his status isn't shown:
{ name: 'fred', age: 100, _id: 509edc9d8aafee8672000001 }
Yet the key/val pair is definitely there; we see that:
console.log(fred.status);
renders:
alive
The same is true of the JSON representation of the object that I'm sending to clients: the "status" isn't included.
I don't understand why. Can anyone help?
Or, alternatively, is there a better approach for adding attributes to mongoose schemas that aren't persisted to the database?
Adding the following to your schema should do what you want:
PersonSchema.virtual('status').get(function () {
  return this._status;
});

PersonSchema.virtual('status').set(function (status) {
  return this._status = status;
});

PersonSchema.set('toObject', {
  getters: true
});
This adds the virtual attribute status - it will not be persisted because it's a virtual. The last part is needed to make your console log output correctly. From the docs:
To have all virtuals show up in your console.log output, set the
toObject option to { getters: true }
Also note that you need to use an internal property name other than status (here I used _status). If you use the same name, you will enter an infinite recursive loop when executing a get.
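A quick usage sketch, assuming the virtual and toObject option defined above:

var fred = new Person({ name: 'fred', age: 100 });
fred.status = 'alive'; // stored on this._status, never persisted

console.log(fred.toObject());
// now includes status: 'alive' alongside name, age and _id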
Simply call .toObject() on the data object.
For your code it will be like:
fred.toObject()
This has been very helpful. I had to struggle with this myself.
In my case, I was getting a document from Mongoose. When I added a new key, it was not visible when I logged the whole object to the console. When I logged the key directly (console.log(data.status)), I could see it, but it was not visible when I logged the entire object.
After reading this thread, it worked.
For example, I got an object like this one from my MongoDB call:
Model.find({}).then(result => {
  // console.log(result); // [{ name: 'John Doe', email: 'john@john.com' }]
  // To add another key to the result, I had to change that result like this:
  var d = result[0];
  var newData = d.toJSON();
  newData["status"] = "alive";
  console.log(newData); // { name: 'John Doe', email: 'john@john.com', status: 'alive' }
}).catch(err => console.log(err));
Hope this helps someone else.
HappyCoding
